Using a multi-character delimiter while exporting a SQL table as a text file using Python

I am trying to export a SQL table as a multi-character delimited text file in Python. I tried Pandas first, but it only supports a single character as the delimiter. Now I am using the NumPy module to export the table data with my custom delimiter `|***|`. This function is very slow on large tables, and it also runs out of memory on them.
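To illustrate the Pandas limitation (a minimal sketch with a throwaway DataFrame; the exact error text varies by pandas version):

    import pandas as pd

    df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})
    df.to_csv("out.txt", sep="|***|")   # fails: to_csv only accepts a 1-character delimiter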

Main concern: being able to use a multi-character delimiter while exporting a table, with reasonable memory usage.

NB: Since numpy.savetxt() does not support exporting the whole table (header plus data) at once, the header and the data are written separately and then combined into a single text file.

Any alternative solution to this, in Python or another language, is highly appreciated!

Here is the code I currently use:

    import numpy
    import pandas as pd

    # database, schema, table_name, table_data and connection are defined elsewhere
    x = pd.read_sql('SELECT * FROM [' + database + '].[' + schema + '].[' + table_name + ']', connection)
    cols = x.columns
    header = "|***|".join(cols)
    # fmt hardcodes five columns (one %s per column)
    numpy.savetxt(table_data, x, fmt="%s|***|%s|***|%s|***|%s|***|%s")
    # re-read the data file and rewrite it with the header prepended
    with open(table_data, 'r') as partial:
        df_data = partial.read()
    with open(table_data, 'w') as whole:
        whole.write(header + "\n" + df_data)
    print("Exported ", table_data)

You don’t need to open the file and then write it again with the header. I don’t know numpy at all, but savetxt has a header argument:

    numpy.savetxt(fname, X, fmt='%.18e', delimiter=' ', newline='\n', header='', footer='', comments='# ', encoding=None)

docs say:

header: str, optional

String that will be written at the beginning of the file.
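
For example, building on the code above (a sketch: `x`, `cols`, and `table_data` come from the question, and `comments=""` stops numpy from prefixing the header line with `# `):

    import numpy as np

    fmt = "|***|".join(["%s"] * len(cols))   # one %s per column, so any column count works
    np.savetxt(table_data, x.to_numpy(), fmt=fmt,
               header="|***|".join(cols),    # written as the first line of the file
               comments="")                  # default comments='# ' would prefix the header

Note that when fmt is a single string containing multiple specifiers like this, the delimiter argument is ignored.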

Anyway, if you are running into memory problems, you might try loading your data as a generator and then writing to the file in append mode.
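For example (a sketch: the query, chunk size, and output path are placeholders, and `connection` is reused from the question; `chunksize` makes `read_sql` return an iterator of smaller DataFrames, so a single streaming write does the job of repeated appends):

    import pandas as pd

    DELIM = "|***|"
    chunks = pd.read_sql("SELECT * FROM my_table", connection, chunksize=10000)

    with open("table_data.txt", "w") as out:
        for i, chunk in enumerate(chunks):
            if i == 0:                                   # write the header once, before the first chunk
                out.write(DELIM.join(chunk.columns) + "\n")
            for row in chunk.itertuples(index=False):    # stream rows; only one chunk held in memory
                out.write(DELIM.join(map(str, row)) + "\n")

This keeps peak memory proportional to the chunk size rather than to the full table.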
