
Pancake V1.29 Release Notes

Data Source Overview

No Updates

Manage Data Source

No Updates

Manage Scan Configuration

New - Configuration Settings Calculator

Allows a user to gather diagnostic information about the data source to retrieve data used in the Calculator's variables. Based on the warehouse size chosen, the calculator produces values for the Number of Procedure Calls, Record Count Per Procedure Call, and Record Count Per Thread Worker Process that can be used for the required configuration settings. The calculator helps the user avoid processing too much information at one time and either running out of memory or exceeding the maximum timeout of 60 minutes per procedure call as set by Snowflake.
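As an illustration of how these settings relate (the numbers and the formula below are hypothetical examples, not the calculator's internal logic), dividing a total record count by the chosen Record Count Per Procedure Call yields the Number of Procedure Calls:

```sql
-- Hypothetical example: 10,000,000 total records split into
-- procedure calls of 2,000,000 records each.
SELECT CEIL(10000000 / 2000000) AS number_of_procedure_calls;  -- 5
```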

Record Count Per Thread Worker Process - Position Moved

The "Record Count Per Thread Worker Process" textbox has been moved below the "Record Count Per Procedure Call" textbox.

Bug Fix - Checkbox Value Reset

Fixed a bug where the checkbox values for Enable Schedule and Auto Code Generate were sometimes not reset after the creation of a new scan configuration.

Scan Data Source

Default Value for Precision and Scale

During the discovery of an attribute's float polymorphic version, the metadata will use a default precision of 38 and a default scale of 10. The user can still modify these values post scan.
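As a sketch, these defaults correspond to a conversion such as the following (the column and table names are placeholders, not the exact code Pancake emits):

```sql
-- Placeholder names; the default precision (38) and scale (10)
-- are applied when none are discovered during the scan.
SELECT TRY_TO_DECIMAL(v:price::VARCHAR, 38, 10) AS price
FROM my_source;
```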

Datetime Inference for ISO date update

ISO datetime values with time zones and seconds represented with a "+" will now be recognized and set to use the TIMESTAMP_TZ Snowflake data type.
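For example, a value of the following shape (the literal is illustrative only) now maps to TIMESTAMP_TZ:

```sql
-- An ISO 8601 value with seconds and a "+" time zone offset:
SELECT TRY_TO_TIMESTAMP_TZ('2024-03-15T08:30:00+02:00') AS ts;
```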

Enhanced error handling for large datasets

Out of memory errors can occur when using a warehouse size that is too small for the amount of data being processed. The new error handling will report these types of messages and make it easier to determine when to use a larger warehouse size.

Data Source Attributes

New attribute metadata fields to determine whether a custom datetime format should be used

Allows for the configuration of whether a custom datetime format should be used as part of the SQL code generation process. If the user disables the datetime format, the standard TRY_TO_? function will be used in the Dynamic Table column definition without specifying a datetime format, resulting in the use of AUTO inference. Find more information about AUTO datetime inference in the Snowflake documentation: https://docs.snowflake.com/en/sql-reference/data-types-datetime#date-and-time-formats
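A sketch of the difference (column and table names are placeholders, and TRY_TO_TIMESTAMP stands in for whichever TRY_TO_? function applies):

```sql
-- Custom datetime format disabled: no format string, AUTO inference.
SELECT TRY_TO_TIMESTAMP(event_time) AS event_ts
FROM my_source;

-- Custom datetime format enabled: the format is passed explicitly.
SELECT TRY_TO_TIMESTAMP(event_time, 'YYYY-MM-DD HH24:MI:SS') AS event_ts
FROM my_source;
```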

Datetime Format now supports multiple formats

Allows for a comma separated list of datetime formats for use when an attribute may have values using multiple datetime formats.

Rearrangement of grid columns

Columns have been rearranged in the editable attributes grid to make it easier to access the columns that can be updated.

Updated Attribute Filter

The filter now supports filtering by the source data type or the Snowflake data type.

Dynamic Table SQL Generation

Numeric Value Flattening Code for Invalid Characters

If an attribute has embedded or stringified JSON with a polymorphic version using an int or float data type, the generated SQL will now convert the variant value to VARCHAR and then use a TRY_TO_? function to determine whether the value can be converted to the appropriate data type. Some values are recognized as a Snowflake INTEGER or DECIMAL by the TYPEOF function but cannot be converted to the respective Snowflake data type.
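A minimal sketch of the generated pattern (names are placeholders, not the exact code Pancake emits):

```sql
-- Cast the VARIANT value to VARCHAR first, then attempt the numeric
-- conversion; values that cannot be converted yield NULL instead of
-- raising an error.
SELECT TRY_TO_NUMBER(v:quantity::VARCHAR) AS quantity
FROM my_source;
```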

Bug Fix - Embedded JSON with "null" values

The generated code will now attempt to parse the embedded or stringified JSON using the TRY_PARSE_JSON function. If the embedded JSON contains the value "null", the resulting value for that column will be NULL and will not cause a Dynamic Table deployment execution error.
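A sketch of the pattern (column names are placeholders):

```sql
-- TRY_PARSE_JSON returns NULL for input it cannot parse, so a
-- stringified "null" surfaces as NULL rather than failing deployment.
SELECT TRY_PARSE_JSON(raw_payload):customer.id AS customer_id
FROM my_source;
```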

Multiple datetime formats for a single column

The code generation for a string using a datetime inference containing multiple formats will use the COALESCE function combined with the TRY_TO_? function, taking the first format that produces a non-null value. If all datetime formats result in a NULL value, then the value returned for that specific row will be NULL.
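A sketch of the generated pattern for two hypothetical formats (names and format strings are placeholders):

```sql
-- The first format that parses wins; if none match, the row gets NULL.
SELECT COALESCE(
         TRY_TO_TIMESTAMP(event_time, 'YYYY-MM-DD HH24:MI:SS'),
         TRY_TO_TIMESTAMP(event_time, 'MM/DD/YYYY HH12:MI:SS AM')
       ) AS event_ts
FROM my_source;
```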

User configurable use of a custom datetime format

If the user disables the use of the datetime format, the standard TRY_TO_? function will be used without specifying a datetime format, resulting in the use of AUTO datetime inference.

Worksheet Commands

No Updates

Previous Releases

V1.26

  1. support decimal data type in addition to float - Allows for discovery of values larger than what can be supported with a Python float data type.

  2. deep scan to discover embedded json in primitive arrays - Allows a user to specify that all values in arrays should be scanned to look for polymorphic versions. Currently only the first value in the array for each record is scanned.

V1.25

  1. add null value configuration - Allows users to configure what value they want to use in the case of a null value in a record.

  2. add polymorphic column alias name - Allows a user to create an alternate custom name for a column in the Dynamic Table in the case of an existing column name being too long or not named well.

  3. add dropdowns for database, schema, and objects on the manage datasource screen - Prevents users from typing a value that is not valid. Also allows users to see which database objects have been made available to Pancake.

V1.24

  1. Kafka Support (UI, Scan, Generation)

  2. Auto Code Gen

  3. Bug Fix - Scan/Discovery - Attribute names that contain periods

  4. Bug Fix - Code Gen - Primitive array dynamic tables with decimal data types - now includes prec/scale

Past Releases

V1.22

  1. Embedded JSON Discovery and Flattening

V1.21

  1. progress updating in scans in process for single procedure call

  2. new status update when scanning begins

  3. moved % complete to column 4 in the scans in process update

V1.20

  1. if the billing event fails while changing the product tier, the product tier will revert to its previous value, but the rest of the data source changes will be saved.

  2. support for datetime inference with am/pm and full support for all datetime formats for iso, us, and eu

  3. display current product tier in the manage data source page without having to click the change product tier button

  4. add toast messages to save

  5. add last_scan_polymorphic_count to data source for alert and add to overview

  6. polymorphic version records getting created for each scan for all array attributes even if they previously existed

  7. create alert for vw_datasources using new field last_scan_polymorphic_count

  8. add message to manage data source when connection fails. add grant statements

  9. add new or update to title in manage data source, manage warehouse and scan configuration

  10. add refresh button to data source overview screen

  11. add help text to filter textboxes in data source overview

  12. update readme to include scripts for setting up alerts
