Contrary to what you might expect, mysqlimport does not allow importing TSVs from standard input. This is an issue if you want to load a gzipped TSV into a database table without first decompressing it to disk. The following script works around this shortcoming:

zcat file.tsv.gz | \
    mysql "$databasename" -e "LOAD DATA LOCAL INFILE '/dev/stdin' INTO TABLE $tablename"

Showing warnings

Another piece of functionality that mysqlimport is missing is reporting warnings in full. If warnings occur during loading, all you get is “1337 warnings”, which does not help you pinpoint the problem. To work around this, you can append SHOW WARNINGS to the end of the command (after a ;).
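For example, the first command can be extended like this (a sketch; `$databasename` and `$tablename` are placeholders for your own database and table):

```
zcat file.tsv.gz | \
    mysql "$databasename" -e "
            LOAD DATA LOCAL INFILE '/dev/stdin' INTO TABLE $tablename;
            SHOW WARNINGS;"
```

Each warning is then printed with its level, error code, and a message describing the affected row and column.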

Loading in binary data

There is no support for loading binary data directly, but there is a simple workaround. If your TSV contains a column with binary data encoded as hex, the following will work:

zcat file.tsv.gz | \
    mysql "$databasename" -e "
            LOAD DATA LOCAL INFILE '/dev/stdin'
            INTO TABLE targettable (col1, col2, @hex3, col4)
            SET col3 = UNHEX(@hex3);
            SHOW WARNINGS;"

Rather than being loaded into the table directly, the third column is read into a user variable, which is then used to set the value of col3.

Reminder: the following pipeline converts binary data to a single line of hex:

... | xxd -p | tr -d "\n"
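To illustrate (using `printf 'hello'` as a stand-in for real binary data), the encoding round-trips cleanly, since `xxd -r -p` is the inverse of `xxd -p`:

```shell
# Encode bytes as a single line of lowercase hex, as expected by UNHEX().
printf 'hello' | xxd -p | tr -d '\n'    # → 68656c6c6f

# Decode the hex back into the original bytes with xxd's reverse mode.
printf '68656c6c6f' | xxd -r -p         # → hello
```

The `tr -d '\n'` matters: xxd wraps its output at 60 hex characters per line, and a newline in the middle of a TSV field would break the import.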

Storing a compressed column

Instead of UNHEX, all kinds of operations can be applied. A nice example is COMPRESS. Using a similar piece of code, you can compress data from your TSV with zlib (the algorithm MySQL’s COMPRESS() uses) as it is loaded.

zcat file.tsv.gz | \
    mysql "$databasename" -e "
            LOAD DATA LOCAL INFILE '/dev/stdin'
            INTO TABLE targettable (col1, col2, @large3, col4)
            SET col3 = COMPRESS(@large3);
            SHOW WARNINGS;"

Want to compress binary data? Combine the last two: COMPRESS(UNHEX(…)).
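A sketch of the combined variant, assuming the same hypothetical table layout as the examples above:

```
zcat file.tsv.gz | \
    mysql "$databasename" -e "
            LOAD DATA LOCAL INFILE '/dev/stdin'
            INTO TABLE targettable (col1, col2, @hex3, col4)
            SET col3 = COMPRESS(UNHEX(@hex3));
            SHOW WARNINGS;"
```

The hex in the TSV is first decoded back to raw bytes, and the result is then compressed before being stored in col3.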

Sources

This solution was inspired by the Stack Overflow post “MySQL import from stdin”. Unfortunately, the best answer did not have the majority of the up-votes at the time of writing.