Import data into a MySQL table with Python

I created this script as a simple way to step through the files found in a
directory and import them into a MySQL database via a cron job. The files are
placed into a specific directory by another task. Each file is a SQL dump that
has been compressed with gzip and encrypted with GPG, so we need to decrypt
and decompress each file before we can import the SQL data.
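
The task that places the files in /data/incoming/ is outside the scope of this
post, but for context, each file is assumed to be produced by something roughly
like the sketch below; the table name, dump command, and output path are
illustrative placeholders rather than part of the actual setup:

#!/usr/bin/python3

import os

# hypothetical producer: dump one table, gzip it, then symmetrically encrypt it
# with the shared passphrase (flags mirror the GnuPG 1.x style options used by
# the import script below)
cmd = ( "mysqldump %%database_name%% some_table"
        " | gzip"
        " | gpg --symmetric --cipher-algo AES256 --batch --passphrase '%%passphrase%%'"
        " -o /data/incoming/some_table.sql.gz.gpg" )
os.system( cmd )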

You will need to have installed and configured on your system: Python 3, the
mysql command-line client, gpg, and gzip.

The code:

#!/usr/bin/python3

import os
from os import walk

gpgKey = '%%passphrase%%'            # passphrase used to decrypt the incoming files
thisDatabase = '%%database_name%%'   # database the dumps are imported into (passed to the mysql client)

filesPath = '/data/incoming/'

# collect the file names in the top level of the incoming directory
# (the break stops walk() from descending into subdirectories)
files = []
for ( dirpath, dirnames, filenames ) in walk( filesPath ):
    files.extend( filenames )
    break

for file in files:

    # skip anything that does not look like an encrypted, gzipped SQL dump
    if not file.endswith( '.sql.gz.gpg' ):
        continue

    fileWithPath = filesPath + file

    # drop the table if it already exists
    tbl = file.replace( '.sql.gz.gpg', '' )
    cmd = "mysql " + thisDatabase + " -e 'DROP TABLE IF EXISTS `" + tbl + "`'"
    os.system( cmd )

    # decrypt and decompress the file into a temporary working file
    # (--no-use-agent applies to GnuPG 1.x; GnuPG 2.1+ may also need --batch --pinentry-mode loopback)
    cmd = "gpg --decrypt --passphrase \'" + gpgKey + "\' --no-use-agent --cipher-algo AES256 " + fileWithPath + " | gunzip > /data/tmp/working.sql"
    os.system( cmd )

    # import the data into the MySQL database
    cmd = "mysql " + thisDatabase + " < /data/tmp/working.sql"
    os.system( cmd )

    # remove the temporary working file
    os.remove( '/data/tmp/working.sql' )

    # remove the processed incoming file
    os.remove( fileWithPath )
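
To run the import from cron, save the script somewhere such as
/usr/local/bin/import_incoming.py (a path chosen here purely for illustration),
make it executable, and add a crontab entry along these lines:

# example crontab entry: run the import every night at 02:00
0 2 * * * /usr/local/bin/import_incoming.py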