View Issue Details

ID: 0000888
Project: Database Workbench v4
Category: Data Pump
View Status: public
Last Update: 2014-04-29 20:47
Reporter: Martijn Tonies
Assigned To: Martijn Tonies
Priority: normal
Severity: minor
Reproducibility: have not tried
Status: closed
Resolution: fixed
Product Version: 4.4.5
Fixed in Version: 4.4.6
Summary: 0000888: DataPump fails on CSV file with large (char) values converted to BigInt (Firebird)
Description: DataPump fails on a CSV file with large (char) values converted to BigInt (Firebird)
Steps To Reproduce: See test case files by Ivan Rakyta.

Source files: http://geolite.maxmind.com/download/geoip/database/GeoLiteCity_CSV/GeoLiteCity-latest.zip
It fails when the value reaches 80000000 hex, which is 2147483648 decimal, one past the largest signed 32-bit integer (2147483647).
I managed to import the values as strings and then convert them to BigInt with no problem.
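A minimal Python sketch of that boundary (an illustration only; it assumes, without confirmation from the Data Pump internals, that the failure is a signed 32-bit overflow during the string-to-integer conversion):

    import struct

    value = 0x80000000            # 2147483648, one past the INT32 maximum of 2147483647

    struct.pack("<q", value)      # fits in a 64-bit (BIGINT) field without error
    try:
        struct.pack("<i", value)  # out of range for a signed 32-bit integer
    except struct.error as exc:
        print("32-bit conversion fails:", exc)

Values below 80000000 hex pass both conversions, which would be consistent with the import running for roughly 1.3 million lines before failing.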

I created an ODBC system DSN data source inside Database Workbench, using the Microsoft Access Text Driver (*.txt, *.csv), and pointed its directory to the folder where I expanded the zip file. The error occurs with GeoLiteCity-Blocks.csv. If you create a Firebird table with two BigInt columns and one Integer column (locid is the integer), map the columns from the CSV, and start the pump, you will get an error after approximately 1.3 million lines (of roughly 1.8 million in total). You may want to remove the first line from the CSV beforehand (I used Notepad++).
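The same mapping can be reproduced outside the Data Pump with a short script. A hedged sketch, assuming the pyfirebirdsql driver and illustrative connection details, database path, table name, and column names; the three CSV columns map to two BigInt columns and one Integer column as described above:

    import csv
    import firebirdsql  # assumption: pyfirebirdsql is installed

    con = firebirdsql.connect(host="localhost", database="/data/geo.fdb",
                              user="SYSDBA", password="masterkey")
    cur = con.cursor()
    cur.execute("CREATE TABLE GEO_BLOCKS ("
                "  START_IP BIGINT,"
                "  END_IP   BIGINT,"
                "  LOC_ID   INTEGER)")
    con.commit()

    with open("GeoLiteCity-Blocks.csv", newline="") as f:
        next(f)  # skip the non-data first line, as suggested above;
                 # any remaining column-header row would need skipping as well
        for start_ip, end_ip, loc_id in csv.reader(f):
            # Python's int() and the BIGINT columns both hold values at or
            # above 80000000 hex, so this insert does not overflow.
            cur.execute("INSERT INTO GEO_BLOCKS VALUES (?, ?, ?)",
                        (int(start_ip), int(end_ip), int(loc_id)))
    con.commit()

Because both the Python integers and the BIGINT target columns are wide enough, values at or above 80000000 hex insert without error, which matches the reporter's string-then-convert workaround.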
Tags: No tags attached.
DBMS & Version:

Activities

There are no notes attached to this issue.

Issue History

Date Modified Username Field Change
2014-04-11 15:26 Martijn Tonies New Issue
2014-04-11 15:26 Martijn Tonies Status new => confirmed
2014-04-11 15:27 Martijn Tonies Steps to Reproduce Updated
2014-04-11 17:12 Martijn Tonies Assigned To => Martijn Tonies
2014-04-11 17:12 Martijn Tonies Status confirmed => assigned
2014-04-11 17:17 Martijn Tonies Status assigned => resolved
2014-04-11 17:17 Martijn Tonies Fixed in Version => 4.4.6
2014-04-11 17:17 Martijn Tonies Resolution open => fixed
2014-04-29 20:47 Martijn Tonies Status resolved => closed