dbldatagen.daterange module
- class DateRange(begin, end, interval=None, datetime_format='%Y-%m-%d %H:%M:%S')[source]
Bases:
DataRange
Class to represent a date range
The date range is represented internally using datetime objects for the start and end, and a timedelta for the interval
When computing ranges for the purpose of sequences, the maximum value is adjusted to the nearest whole multiple of the interval that falls before the end value.
When converting from a string, the datetime is assumed to use the local timezone unless a timezone is specified as part of the format, in keeping with Python's handling of datetime instances that do not specify a timezone
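The adjustment of the maximum value can be sketched with the standard library. This is a hypothetical illustration of the behavior described above, not the library's internal code; the function name clamp_to_interval is an assumption for the example.

```python
from datetime import datetime, timedelta

# Hypothetical sketch (not dbldatagen's internal code): clamp the end of a
# range to the nearest whole multiple of the interval before the end value.
def clamp_to_interval(begin: datetime, end: datetime, interval: timedelta) -> datetime:
    # Number of whole intervals that fit between begin and end
    num_intervals = (end - begin) // interval
    return begin + num_intervals * interval

begin = datetime(2023, 1, 1)
end = datetime(2023, 1, 10, 15, 0)
# With a 1-day interval, the adjusted maximum is 2023-01-10 00:00,
# the last whole multiple of the interval that is not after the end value
print(clamp_to_interval(begin, end, timedelta(days=1)))
```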
- Parameters:
begin – start of the date range as a Python datetime object. If specified as a string, it is converted to a datetime
end – end of the date range as a Python datetime object. If specified as a string, it is converted to a datetime
interval – interval of the date range as a Python timedelta object. Note that the parsing format for the interval uses standard timedelta parsing, not the datetime_format string
datetime_format – format for conversion of strings to datetime objects
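The string-to-datetime conversion controlled by datetime_format can be illustrated with the standard library's strptime, which follows the same format codes. This is a stdlib sketch of the described behavior, not the library's internal code:

```python
from datetime import datetime

# Sketch of the string conversion (stdlib only): begin/end strings are
# parsed according to datetime_format. Without a timezone in the format,
# the result is a naive datetime interpreted in the local timezone.
datetime_format = "%Y-%m-%d %H:%M:%S"
begin = datetime.strptime("2023-01-01 00:00:00", datetime_format)
end = datetime.strptime("2023-12-31 23:59:59", datetime_format)
print(begin, end)
```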
- DEFAULT_DATE_FORMAT = '%Y-%m-%d'
- DEFAULT_END_DATE = datetime.date(2023, 12, 31)
- DEFAULT_END_DATE_TIMESTAMP = datetime.datetime(2023, 12, 31, 0, 0)
- DEFAULT_END_TIMESTAMP = datetime.datetime(2023, 12, 31, 23, 59, 59)
- DEFAULT_START_DATE = datetime.date(2023, 1, 1)
- DEFAULT_START_DATE_TIMESTAMP = datetime.datetime(2023, 1, 1, 0, 0)
- DEFAULT_START_TIMESTAMP = datetime.datetime(2023, 1, 1, 0, 0)
- DEFAULT_UTC_TS_FORMAT = '%Y-%m-%d %H:%M:%S'
- adjustForColumnDatatype(ctype)[source]
Adjust the range for the column's output data type
- Parameters:
ctype – Spark SQL data type for column
- computeTimestampIntervals(start, end, interval)[source]
Compute the number of intervals between the start and end timestamps
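The interval count described above reduces to timedelta floor division in the standard library. The sketch below is a hypothetical illustration of that computation, not the library's implementation:

```python
from datetime import datetime, timedelta

# Hypothetical sketch (not dbldatagen's code): the number of whole
# intervals between start and end via timedelta floor division.
def compute_intervals(start: datetime, end: datetime, interval: timedelta) -> int:
    return (end - start) // interval

# 364 whole 1-day intervals fit between the default start and end timestamps
print(compute_intervals(datetime(2023, 1, 1),
                        datetime(2023, 12, 31, 23, 59, 59),
                        timedelta(days=1)))
```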