Application core (core)
Internal logging utility for fabrictools.
- fabrictools.core.logging.log(message: str, level: str = 'info') None[source]
Emit a timestamped line on the fabrictools logger.
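The line format is not specified above; a minimal sketch of such a helper, assuming a `[timestamp] [LEVEL] message` layout (the exact format fabrictools emits may differ, and `format_log_line` is a hypothetical helper for illustration):

```python
from datetime import datetime, timezone

def format_log_line(message: str, level: str = "info") -> str:
    # Assumed layout: "[2024-01-01T12:00:00+00:00] [INFO] message"
    ts = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return f"[{ts}] [{level.upper()}] {message}"

def log(message: str, level: str = "info") -> None:
    # Emit a timestamped line; fabrictools routes this through its own logger.
    print(format_log_line(message, level))
```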
Path resolution helpers for Microsoft Fabric resources.
Lakehouse helpers accept slash paths and SQL-style schema.table (first dot only)
when the string has no path separators, e.g. dbo.PdC Extraction → dbo/PdC Extraction.
- fabrictools.core.paths.build_lakehouse_read_path_candidates(relative_path: str) List[str][source]
Build ordered candidate relative paths for Lakehouse reads.
Normalizes slashes, maps SQL-style schema.table (e.g. dbo.PdC Extraction) to schema/table when there is no slash, then may prepend Tables/dbo or Files when the path omits those prefixes (Fabric layout).
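The normalization above can be sketched as follows; the candidate ordering and the exact Tables/ and Files/ prefix rules are assumptions for illustration, not the library's verified behaviour:

```python
from typing import List

def candidate_read_paths(relative_path: str) -> List[str]:
    # Normalize separators and trim leading/trailing slashes.
    p = relative_path.replace("\\", "/").strip("/")
    # SQL-style schema.table is only recognized when there is no slash;
    # only the first dot splits, so table names may themselves contain dots.
    if "/" not in p and "." in p:
        schema, table = p.split(".", 1)
        p = f"{schema}/{table}"
    candidates = [p]
    # When neither Fabric prefix is present, also try both layouts.
    if not p.startswith(("Tables/", "Files/")):
        candidates.append(f"Tables/{p}")
        candidates.append(f"Files/{p}")
    return candidates
```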
- fabrictools.core.paths.build_lakehouse_write_path(relative_path: str) str[source]
Normalize a Lakehouse write path (Tables/dbo/... or Files/...). Accepts SQL-style schema.table (e.g. dbo.PdC Extraction) when there is no slash; it is mapped to schema/table before applying Fabric layout rules.
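A sketch of the write-path rule under the same assumptions; defaulting un-prefixed paths to Tables/ is a guess, and the real helper may decide differently:

```python
def build_write_path(relative_path: str) -> str:
    p = relative_path.replace("\\", "/").strip("/")
    if "/" not in p and "." in p:
        # Map SQL-style schema.table to schema/table (first dot only).
        schema, table = p.split(".", 1)
        p = f"{schema}/{table}"
    if not p.startswith(("Tables/", "Files/")):
        # Assumption: un-prefixed writes land under Tables/.
        p = f"Tables/{p}"
    return p
```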
- fabrictools.core.paths.get_lakehouse_abfs_path(lakehouse_name: str) str[source]
Resolve the full ABFS base path for a Lakehouse display name.
- Parameters:
lakehouse_name (str) – Lakehouse name as shown in Fabric.
- Returns:
abfsPath from lakehouse properties.
- Return type:
str
- Raises:
ValueError – If notebookutils is missing or resolution fails.
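For reference, a OneLake ABFS base path typically has the shape `abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse`. A sketch that builds this shape (the real helper reads abfsPath from the lakehouse properties via notebookutils, so the host and `.Lakehouse` suffix here are assumptions):

```python
def onelake_abfs_path(workspace_name: str, lakehouse_name: str) -> str:
    # Assumed OneLake URI shape; the actual value comes from lakehouse properties.
    return (
        f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse_name}.Lakehouse"
    )
```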
- fabrictools.core.paths.get_warehouse_jdbc_url(warehouse_name: str) str[source]
Build a JDBC URL for a Fabric Warehouse (SQL endpoint + database).
- Parameters:
warehouse_name (str) – Warehouse display name in Fabric.
- Returns:
JDBC URL string for Spark jdbc format reads/writes.
- Return type:
str
- Raises:
ValueError – If notebookutils is missing or resolution fails.
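The resulting URL follows the standard SQL Server JDBC driver syntax; a hedged sketch of the shape (the property list and the endpoint value are assumptions, not the helper's exact output):

```python
def warehouse_jdbc_url(sql_endpoint: str, database: str) -> str:
    # SQL Server JDBC driver syntax; Fabric SQL endpoints listen on port 1433.
    return (
        f"jdbc:sqlserver://{sql_endpoint}:1433;"
        f"database={database};encrypt=true;trustServerCertificate=false"
    )
```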
SparkSession accessor for fabrictools.
- fabrictools.core.spark.get_spark() pyspark.sql.SparkSession[source]
Return the active SparkSession, creating one if none exists.
- Returns:
Current or newly built session.
- Return type:
SparkSession
Core shared utilities for fabrictools.
Exports logging, Spark session access, and Fabric path / JDBC resolution helpers.
See submodules fabrictools.core.logging, fabrictools.core.paths,
fabrictools.core.spark.
- fabrictools.core.build_lakehouse_read_path_candidates(relative_path: str) List[str][source]
Build ordered candidate relative paths for Lakehouse reads.
Normalizes slashes, maps SQL-style schema.table (e.g. dbo.PdC Extraction) to schema/table when there is no slash, then may prepend Tables/dbo or Files when the path omits those prefixes (Fabric layout).
- fabrictools.core.build_lakehouse_write_path(relative_path: str) str[source]
Normalize a Lakehouse write path (Tables/dbo/... or Files/...). Accepts SQL-style schema.table (e.g. dbo.PdC Extraction) when there is no slash; it is mapped to schema/table before applying Fabric layout rules.
- fabrictools.core.get_lakehouse_abfs_path(lakehouse_name: str) str[source]
Resolve the full ABFS base path for a Lakehouse display name.
- Parameters:
lakehouse_name (str) – Lakehouse name as shown in Fabric.
- Returns:
abfsPath from lakehouse properties.
- Return type:
str
- Raises:
ValueError – If notebookutils is missing or resolution fails.
- fabrictools.core.get_spark() pyspark.sql.SparkSession[source]
Return the active SparkSession, creating one if none exists.
- Returns:
Current or newly built session.
- Return type:
SparkSession
- fabrictools.core.get_warehouse_jdbc_url(warehouse_name: str) str[source]
Build a JDBC URL for a Fabric Warehouse (SQL endpoint + database).
- Parameters:
warehouse_name (str) – Warehouse display name in Fabric.
- Returns:
JDBC URL string for Spark jdbc format reads/writes.
- Return type:
str
- Raises:
ValueError – If notebookutils is missing or resolution fails.