Problem with Python Logging - empty .log files get created that I don't want

Hi there!

I’m having a bit of a problem with Python logging that I can’t figure out on my own, but I’m pretty sure I’m just missing a small detail…

So the problem is the following:
I have a Python module (effective_etl.py) where I define the logging config like this:

# effective_etl.py
import logging
import math
import re
from datetime import date

import numpy as np
import pandas as pd
import teradata
from google.oauth2 import service_account
from pandas.io import gbq

import credentials

# Configure logging
def get_logger(name):
    logger = logging.getLogger(name)
    if not logger.handlers:
        logger.propagate = False
        console = logging.StreamHandler()
        formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(name)s - Code line %(lineno)d - %(message)s')
        console.setFormatter(formatter)
        logger.addHandler(console)
    return logger

logger = get_logger(__name__)
logger.setLevel(logging.INFO)

td_logger = get_logger('teradata')
td_logger.setLevel(logging.ERROR)

gbq_logger = get_logger('pandas_gbq.gbq')
gbq_logger.setLevel(logging.ERROR)
#...
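For context, the `if not logger.handlers:` guard is there so that repeated calls to `get_logger` don’t stack duplicate console handlers. A minimal sketch of that behavior in isolation (the logger name here is just an example, not from my real code):

```python
import logging

def get_logger(name):
    """Return a logger with exactly one console handler, no matter how often it's called."""
    logger = logging.getLogger(name)
    if not logger.handlers:  # only attach a handler on the first call
        logger.propagate = False  # don't bubble records up to the root logger
        console = logging.StreamHandler()
        console.setFormatter(logging.Formatter('%(asctime)s - %(levelname)s - %(name)s - %(message)s'))
        logger.addHandler(console)
    return logger

# Calling it twice returns the same logger object with a single handler.
a = get_logger('demo_logger')
b = get_logger('demo_logger')
```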

I then use logger later on in this module in a bunch of functions, like this for example:

def execute_selects(select_stmt, session_name, where_params=None):
    """Executes a select statement against the given connection and returns query result in a dataframe

    Parameters
    ----------
    select_stmt : String, containing the query.
    session_name : String, that is used when opening a session to the database.
    where_params : (Optional) Dictionary, containing possible parameters for the where clause that get substituted.

    Returns
    -------
    df : Dataframe with the query result set
    """
    prepared_stmt = _prepare_stmt(select_stmt, where_params)

    try:
        with _build_connection_object(session_name) as session:
            df = pd.read_sql(prepared_stmt, session)
            logger.info(f'Execution successful: {df.shape[0]} rows were loaded into memory.')

    except Exception:
        logger.error(f'Exception occurred: Unable to execute select {prepared_stmt}.')
        raise

    return df

I then import this module in another Python script like this:

# other_script.py
import effective_etl as ee

And I use the functions from effective_etl like this (just an example):

df_tablenames_ktnr = ee.execute_selects(select_stmt=select_dbc_columns_ktnr,
                                        session_name=SESSION_NAME,
                                        )

It all works smoothly and logs to the console just fine (which I want!), but unfortunately the script (other_script.py) also produces empty .log files when it’s run, and THAT I don’t want. I have no idea why these log files are being created.
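In case it helps with diagnosing this: here’s a small probe (standard library only; the helper name is mine, not from my actual code) that walks every logger the `logging` module knows about and reports any `FileHandler` that is currently attached, which I’d expect to reveal which logger owns the stray file handler:

```python
import logging

def find_file_handlers():
    """Return (logger_name, log_file_path) for every FileHandler attached to any known logger."""
    # The root logger is not in loggerDict, so include it explicitly.
    loggers = [logging.getLogger()]
    loggers += [logging.getLogger(name) for name in logging.root.manager.loggerDict]
    found = []
    for lg in loggers:
        for handler in lg.handlers:
            if isinstance(handler, logging.FileHandler):
                found.append((lg.name, handler.baseFilename))
    return found
```

Running this right after the imports in other_script.py (imports alone can attach handlers) versus after the first function call should narrow down when the .log files appear.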

Does anyone have any ideas/tips?

Greatly appreciated.