Automating batch load of multiple GPX files into PostGIS?

For pure Python, use GDAL's OGR module:

import os
from osgeo import ogr
from glob import glob

# Establish a connection to a PostGIS database
pg = ogr.GetDriverByName('PostgreSQL')
if pg is None:
    raise RuntimeError('PostgreSQL driver not available')
conn = pg.Open("PG:dbname='postgis' user='postgres'", True)
if conn is None:
    raise RuntimeError('Cannot open dataset connection')

# Loop through each GPX file
for gpx_file in glob('/path/to/*.gpx'):
    ds = ogr.Open(gpx_file)
    if ds is None:
        print('Skipping ' + gpx_file)
        continue  # skip unreadable files instead of crashing below
    print('Opened ' + gpx_file)
    prefix = os.path.splitext(os.path.basename(gpx_file))[0]
    # Get each layer
    for iLayer in range(ds.GetLayerCount()):
        layer = ds.GetLayer(iLayer)
        layer_name = prefix + '_' + layer.GetName()
        if layer.GetFeatureCount() == 0:
            print(' -> Skipping ' + layer_name + ' since it is empty')
        else:
            print(' -> Copying ' + layer_name)
            pg_layer = conn.CopyLayer(layer, layer_name)
            if pg_layer is None:
                print(' |-> Failed to copy')
    ds = None  # close the GPX file

After additional research, I wrote my own gpx2postgis.py Python script that automates appending GPX features to existing tables. The script builds on portions of @Mike T's answer above, among others. It is on GitHub if you would like to download it or contribute. It creates new table schemas as needed from the input GPX sublayers, and appends features to those tables.

While not a Python solution, I came across this similar question on Stack Overflow, which made me realize I could simply loop over all of my GPX files and call the ogr2ogr command-line tool to process each one using the GPX feature types.

ogr2ogr -append -f PostgreSQL "PG:dbname=your_db user=xxxx password=yyyy" filename.gpx
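Wrapped in a shell loop, that single command handles a whole directory. A minimal sketch (the directory path, database name, and credentials are placeholders):

```shell
#!/bin/sh
# Append every GPX file in a directory to PostGIS via ogr2ogr
load_gpx_dir() {
    for f in "$1"/*.gpx; do
        [ -e "$f" ] || continue  # glob matched nothing; skip
        ogr2ogr -append -f PostgreSQL "PG:dbname=your_db user=xxxx password=yyyy" "$f"
    done
}

load_gpx_dir /path/to
```

The -append flag keeps ogr2ogr from failing when the target tables already exist, so the loop can be re-run safely.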