In SAS® 9.4M9, when using SAS/ACCESS to Spark with the BULKLOAD=YES option to create a Databricks table, DATA step execution can fail with an error message similar to the following:
ERROR: Execute error: Error running query: [DELTA_EXCEED_CHAR_VARCHAR_LIMIT]
com.databricks.sql.transaction.tahoe.schema.DeltaInvariantViolationException:
[DELTA_EXCEED_CHAR_VARCHAR_LIMIT] Value "OH" exceeds char/varchar type
length limitation. Failed check: ((STATE_1 IS NULL) OR (length(STATE_1) <= 1)).
This issue occurs because an incorrect format is applied during the bulk-load process, which causes the target character column to be created with too short a length for the incoming data.
To work around this issue, specify the BULKLOAD=NO option in the SAS/ACCESS to Spark LIBNAME statement.
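For reference, the workaround might look like the following sketch. The server, HTTP path, and credential values here are placeholder assumptions, not values from this note; substitute the connection options that match your Databricks environment.

```sas
/* Hypothetical connection values; replace with your own. */
libname dbx spark
   server="example.cloud.databricks.com"    /* assumed Databricks host       */
   httpPath="/sql/1.0/warehouses/abc123"    /* assumed SQL warehouse path    */
   user="token"
   password="&dbx_token."                   /* assumed macro variable with a
                                               personal access token         */
   bulkload=no;                             /* workaround: disable bulk load */

/* With BULKLOAD=NO, the DATA step inserts rows through the
   standard interface rather than the bulk-load facility. */
data dbx.customers;
   set work.customers;
run;
```

Note that disabling bulk loading can slow large loads considerably, so it is best treated as a temporary workaround rather than a permanent setting.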