Commit d01bb95

Merge pull request #15 from Benjamin-Knight/remove_fabric_dependency

Remove fabric dependency

2 parents bcf4ac9 + 1d30b38

71 files changed

Lines changed: 3856 additions & 321 deletions


.github/workflows/integration-tests-sqlserver.yml

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ jobs:
     name: Regular
     strategy:
       matrix:
-        python_version: ["3.9", "3.10", "3.11", "3.12"]
+        python_version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
         msodbc_version: ["17", "18"]
         sqlserver_version: ["2017", "2019", "2022"]
         collation: ["SQL_Latin1_General_CP1_CS_AS", "SQL_Latin1_General_CP1_CI_AS"]
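A workflow matrix expands to the cross product of its axes, so adding "3.13" grows every matrix in this commit by one Python version. A quick sketch of the resulting job count for the integration-test matrix above (assuming no matrix excludes are configured):

```python
from itertools import product

# Axes from the updated integration-test matrix (after adding 3.13).
python_versions = ["3.9", "3.10", "3.11", "3.12", "3.13"]
msodbc_versions = ["17", "18"]
sqlserver_versions = ["2017", "2019", "2022"]
collations = ["SQL_Latin1_General_CP1_CS_AS", "SQL_Latin1_General_CP1_CI_AS"]

# GitHub Actions runs one job per combination.
jobs = list(product(python_versions, msodbc_versions, sqlserver_versions, collations))
print(len(jobs))  # 5 * 2 * 3 * 2 = 60 jobs per run
```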

.github/workflows/publish-docker.yml

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ jobs:
   publish-docker-client:
     strategy:
       matrix:
-        python_version: ["3.9", "3.10", "3.11", "3.12"]
+        python_version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
         docker_target: ["msodbc17", "msodbc18"]
     runs-on: ubuntu-latest
     permissions:

.github/workflows/unit-tests.yml

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ jobs:
     name: Unit tests
     strategy:
      matrix:
-        python_version: ["3.9", "3.10", "3.11", "3.12"]
+        python_version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
     runs-on: ubuntu-latest
     permissions:
       contents: read

Makefile

Lines changed: 2 additions & 1 deletion
@@ -1,4 +1,5 @@
 .DEFAULT_GOAL:=help
+THREADS ?= auto

 .PHONY: dev
 dev: ## Installs adapter in develop mode along with development dependencies
@@ -44,7 +45,7 @@ unit: ## Runs unit tests.
 .PHONY: functional
 functional: ## Runs functional tests.
 	@\
-	pytest -n auto -ra -v tests/functional
+	pytest -n $(THREADS) -ra -v tests/functional

 .PHONY: test
 test: ## Runs unit tests and code checks against staged changes.
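Make's `?=` only assigns `THREADS` when the caller has not already defined it, so `make functional` keeps the old `-n auto` behaviour while `make functional THREADS=4` caps the pytest worker count. A sketch of the same default-with-override pattern in Python (the `THREADS` environment variable here is an illustration, not something the Makefile itself reads):

```python
import os

# Mirror `THREADS ?= auto`: honour the caller's value if set, else fall back.
threads = os.environ.get("THREADS", "auto")
command = f"pytest -n {threads} -ra -v tests/functional"
print(command)
```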

README.md

Lines changed: 27 additions & 0 deletions
@@ -49,6 +49,33 @@ pip install -U --pre dbt-sqlserver

 See [the changelog](CHANGELOG.md)

+## Configuration
+
+### Flags
+
+- `dbt_sqlserver_use_default_schema_concat`: *(default: `false`)* Controls schema name generation when a [custom schema](https://docs.getdbt.com/docs/build/custom-schemas) is set on a model.
+
+| Flag value | `custom_schema_name` | Result |
+|---|---|---|
+| `false` (default, legacy) | *(none)* | `target.schema` |
+| `false` (default, legacy) | `"reporting"` | `reporting` |
+| `true` (dbt-core standard) | *(none)* | `target.schema` |
+| `true` (dbt-core standard) | `"reporting"` | `target.schema_reporting` |
+
+When `false` (the default), the adapter uses its legacy behaviour: `custom_schema_name` is used **as-is** without being prefixed by `target.schema`.
+When `true`, the adapter delegates to dbt-core's `default__generate_schema_name`, which concatenates `target.schema` + `_` + `custom_schema_name`.
+
+**Example usage in `dbt_project.yml`:**
+
+```yaml
+vars:
+  dbt_sqlserver_use_default_schema_concat: true # Enable standard schema concatenation
+```
+
+> **Note:** If you want to permanently customise schema generation and avoid any future deprecation of this flag, override the `sqlserver__generate_schema_name` macro directly in your project.
+
 ## Contributing

 [![Unit tests](https://github.com/dbt-msft/dbt-sqlserver/actions/workflows/unit-tests.yml/badge.svg)](https://github.com/dbt-msft/dbt-sqlserver/actions/workflows/unit-tests.yml)
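The two schema-naming behaviours documented in the README diff above can be modelled in plain Python. This is a sketch of the flag's semantics only, not the adapter's actual Jinja macro; the function name and signature are invented for illustration:

```python
def generate_schema_name(target_schema, custom_schema_name=None, use_default_concat=False):
    """Model the `dbt_sqlserver_use_default_schema_concat` flag's behaviour."""
    if custom_schema_name is None:
        # No custom schema configured: both modes fall back to target.schema.
        return target_schema
    if use_default_concat:
        # dbt-core standard: target.schema + "_" + custom_schema_name.
        return f"{target_schema}_{custom_schema_name}"
    # Legacy adapter behaviour: custom name used as-is, no prefix.
    return custom_schema_name

print(generate_schema_name("dbo", "reporting", use_default_concat=False))  # reporting
print(generate_schema_name("dbo", "reporting", use_default_concat=True))   # dbo_reporting
print(generate_schema_name("dbo"))                                         # dbo
```

The output matches the four rows of the README's flag table.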

dbt/adapters/sqlserver/__init__.py

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -11,7 +11,7 @@
1111
adapter=SQLServerAdapter,
1212
credentials=SQLServerCredentials,
1313
include_path=sqlserver.PACKAGE_PATH,
14-
dependencies=["fabric"],
14+
dependencies=[],
1515
)
1616

1717
__all__ = [

dbt/adapters/sqlserver/sqlserver_adapter.py

Lines changed: 238 additions & 15 deletions
@@ -1,24 +1,45 @@
-from typing import Optional
+from typing import List, Optional

-import dbt.exceptions
+import agate
+import dbt_common.exceptions
+from dbt.adapters.base.column import Column as BaseColumn
 from dbt.adapters.base.impl import ConstraintSupport
-from dbt.adapters.fabric import FabricAdapter
-from dbt.contracts.graph.nodes import ConstraintType
+from dbt.adapters.base.meta import available
+from dbt.adapters.base.relation import BaseRelation
+from dbt.adapters.capability import Capability, CapabilityDict, CapabilitySupport, Support
+from dbt.adapters.events.types import SchemaCreation
+from dbt.adapters.reference_keys import _make_ref_key_dict
+from dbt.adapters.sql.impl import CREATE_SCHEMA_MACRO_NAME, SQLAdapter
+from dbt_common.behavior_flags import BehaviorFlag
+from dbt_common.contracts.constraints import (
+    ColumnLevelConstraint,
+    ConstraintType,
+    ModelLevelConstraint,
+)
+from dbt_common.events.functions import fire_event

 from dbt.adapters.sqlserver.sqlserver_column import SQLServerColumn
+from dbt.adapters.sqlserver.sqlserver_configs import SQLServerConfigs
 from dbt.adapters.sqlserver.sqlserver_connections import SQLServerConnectionManager
 from dbt.adapters.sqlserver.sqlserver_relation import SQLServerRelation


-class SQLServerAdapter(FabricAdapter):
+class SQLServerAdapter(SQLAdapter):
     """
     Controls actual implmentation of adapter, and ability to override certain methods.
     """

     ConnectionManager = SQLServerConnectionManager
     Column = SQLServerColumn
+    AdapterSpecificConfigs = SQLServerConfigs
     Relation = SQLServerRelation

+    _capabilities: CapabilityDict = CapabilityDict(
+        {
+            Capability.SchemaMetadataByRelations: CapabilitySupport(support=Support.Full),
+            Capability.TableLastModifiedMetadata: CapabilitySupport(support=Support.Full),
+        }
+    )
     CONSTRAINT_SUPPORT = {
         ConstraintType.check: ConstraintSupport.ENFORCED,
         ConstraintType.not_null: ConstraintSupport.ENFORCED,
@@ -27,13 +48,196 @@ class SQLServerAdapter(FabricAdapter):
         ConstraintType.foreign_key: ConstraintSupport.ENFORCED,
     }

+    @property
+    def _behavior_flags(self) -> List[BehaviorFlag]:
+        return [
+            {
+                "name": "empty",
+                "default": False,
+                "description": (
+                    "When enabled, table and view materializations will be created as empty "
+                    "structures (no data)."
+                ),
+            },
+            {
+                "name": "dbt_sqlserver_use_default_schema_concat",
+                "default": False,
+                "description": (
+                    "When True, uses dbt-core's standard schema concatenation "
+                    "(`target.schema` + `_` + `custom_schema_name`). "
+                    "When False (default), uses legacy adapter behaviour: "
+                    "`custom_schema_name` is used directly without prefixing `target.schema`. "
+                    "For a permanent solution, override the `sqlserver__generate_schema_name` "
+                    "macro in your project instead."
+                ),
+            },
+        ]
+
+    @available.parse(lambda *a, **k: [])
+    def get_column_schema_from_query(self, sql: str) -> List[BaseColumn]:
+        """Get a list of the Columns with names and data types from the given sql."""
+        _, cursor = self.connections.add_select_query(sql)
+
+        columns = [
+            self.Column.create(
+                column_name, self.connections.data_type_code_to_name(column_type_code)
+            )
+            # https://peps.python.org/pep-0249/#description
+            for column_name, column_type_code, *_ in cursor.description
+        ]
+        return columns
+
+    @classmethod
+    def convert_boolean_type(cls, agate_table, col_idx):
+        return "bit"
+
+    @classmethod
+    def convert_datetime_type(cls, agate_table, col_idx):
+        return "datetime2(6)"
+
     @classmethod
-    def render_model_constraint(cls, constraint) -> Optional[str]:
+    def convert_number_type(cls, agate_table, col_idx):
+        decimals = agate_table.aggregate(agate.MaxPrecision(col_idx))
+        return "float" if decimals else "int"
+
+    def create_schema(self, relation: BaseRelation) -> None:
+        relation = relation.without_identifier()
+        fire_event(SchemaCreation(relation=_make_ref_key_dict(relation)))
+        macro_name = CREATE_SCHEMA_MACRO_NAME
+        kwargs = {
+            "relation": relation,
+        }
+
+        if self.config.credentials.schema_authorization:
+            kwargs["schema_authorization"] = self.config.credentials.schema_authorization
+            macro_name = "sqlserver__create_schema_with_authorization"
+
+        self.execute_macro(macro_name, kwargs=kwargs)
+        self.commit_if_has_connection()
+
+    @classmethod
+    def convert_text_type(cls, agate_table, col_idx):
+        column = agate_table.columns[col_idx]
+        # see https://github.com/fishtown-analytics/dbt/pull/2255
+        lens = [len(d.encode("utf-8")) for d in column.values_without_nulls()]
+        max_len = max(lens) if lens else 64
+        length = max_len if max_len > 16 else 16
+        return "varchar({})".format(length)
+
+    @classmethod
+    def convert_time_type(cls, agate_table, col_idx):
+        return "time(6)"
+
+    @classmethod
+    def date_function(cls):
+        return "getdate()"
+
+    # Methods used in adapter tests
+    def timestamp_add_sql(self, add_to: str, number: int = 1, interval: str = "hour") -> str:
+        # note: 'interval' is not supported for T-SQL
+        # for backwards compatibility, we're compelled to set some sort of
+        # default. A lot of searching has lead me to believe that the
+        # '+ interval' syntax used in postgres/redshift is relatively common
+        # and might even be the SQL standard's intention.
+        return f"DATEADD({interval},{number},{add_to})"
+
+    def string_add_sql(
+        self,
+        add_to: str,
+        value: str,
+        location="append",
+    ) -> str:
+        """
+        `+` is T-SQL's string concatenation operator
+        """
+        if location == "append":
+            return f"{add_to} + '{value}'"
+        elif location == "prepend":
+            return f"'{value}' + {add_to}"
+        else:
+            raise ValueError(f'Got an unexpected location value of "{location}"')
+
+    def get_rows_different_sql(
+        self,
+        relation_a: BaseRelation,
+        relation_b: BaseRelation,
+        column_names: Optional[List[str]] = None,
+        except_operator: str = "EXCEPT",
+    ) -> str:
+        """
+        note: using is not supported on Synapse so COLUMNS_EQUAL_SQL is adjsuted
+        Generate SQL for a query that returns a single row with a two
+        columns: the number of rows that are different between the two
+        relations and the number of mismatched rows.
+        """
+        # This method only really exists for test reasons.
+        names: List[str]
+        if column_names is None:
+            columns = self.get_columns_in_relation(relation_a)
+            names = sorted((self.quote(c.name) for c in columns))
+        else:
+            names = sorted((self.quote(n) for n in column_names))
+        columns_csv = ", ".join(names)
+
+        if columns_csv == "":
+            columns_csv = "*"
+
+        sql = COLUMNS_EQUAL_SQL.format(
+            columns=columns_csv,
+            relation_a=str(relation_a),
+            relation_b=str(relation_b),
+            except_op=except_operator,
+        )
+
+        return sql
+
+    def valid_incremental_strategies(self):
+        """The set of standard builtin strategies which this adapter supports out-of-the-box.
+        Not used to validate custom strategies defined by end users.
+        """
+        return ["append", "delete+insert", "merge", "microbatch"]
+
+    # This is for use in the test suite
+    def run_sql_for_tests(self, sql, fetch, conn):
+        cursor = conn.handle.cursor()
+        try:
+            cursor.execute(sql)
+            if not fetch:
+                conn.handle.commit()
+            if fetch == "one":
+                return cursor.fetchone()
+            elif fetch == "all":
+                return cursor.fetchall()
+            else:
+                return
+        except BaseException:
+            if conn.handle and not getattr(conn.handle, "closed", True):
+                conn.handle.rollback()
+            raise
+        finally:
+            conn.transaction_open = False
+
+    @available
+    @classmethod
+    def render_column_constraint(cls, constraint: ColumnLevelConstraint) -> Optional[str]:
+        rendered_column_constraint = None
+        if constraint.type == ConstraintType.not_null:
+            rendered_column_constraint = "not null "
+        else:
+            rendered_column_constraint = ""
+
+        if rendered_column_constraint:
+            rendered_column_constraint = rendered_column_constraint.strip()
+
+        return rendered_column_constraint
+
+    @classmethod
+    def render_model_constraint(cls, constraint: ModelLevelConstraint) -> Optional[str]:
         constraint_prefix = "add constraint "
         column_list = ", ".join(constraint.columns)

         if constraint.name is None:
-            raise dbt.exceptions.DbtDatabaseError(
+            raise dbt_common.exceptions.DbtDatabaseError(
                 "Constraint name cannot be empty. Provide constraint name - column "
                 + column_list
                 + " and run the project again."
@@ -56,12 +260,31 @@ def render_model_constraint(cls, constraint) -> Optional[str]:
         else:
             return None

-    @classmethod
-    def date_function(cls):
-        return "getdate()"

-    def valid_incremental_strategies(self):
-        """The set of standard builtin strategies which this adapter supports out-of-the-box.
-        Not used to validate custom strategies defined by end users.
-        """
-        return ["append", "delete+insert", "merge", "microbatch"]
+COLUMNS_EQUAL_SQL = """
+with diff_count as (
+    SELECT
+        1 as id,
+        COUNT(*) as num_missing FROM (
+            (SELECT {columns} FROM {relation_a} {except_op}
+             SELECT {columns} FROM {relation_b})
+             UNION ALL
+            (SELECT {columns} FROM {relation_b} {except_op}
+             SELECT {columns} FROM {relation_a})
+        ) as a
+), table_a as (
+    SELECT COUNT(*) as num_rows FROM {relation_a}
+), table_b as (
+    SELECT COUNT(*) as num_rows FROM {relation_b}
+), row_count_diff as (
+    select
+        1 as id,
+        table_a.num_rows - table_b.num_rows as difference
+    from table_a, table_b
+)
+select
+    row_count_diff.difference as row_count_difference,
+    diff_count.num_missing as num_mismatched
+from row_count_diff
+join diff_count on row_count_diff.id = diff_count.id
+""".strip()
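The `COLUMNS_EQUAL_SQL` template added above reports two numbers: the raw row-count difference between the relations, and the count of rows present in only one of them (`EXCEPT` in both directions, combined with `UNION ALL`). Since `EXCEPT` is set-based, duplicate rows collapse before counting. A sketch of the same semantics in plain Python, with made-up rows for illustration:

```python
def rows_different(rows_a, rows_b):
    """Mirror COLUMNS_EQUAL_SQL's two outputs for lists of row tuples."""
    set_a, set_b = set(rows_a), set(rows_b)
    # (a EXCEPT b) UNION ALL (b EXCEPT a): rows appearing in exactly one relation.
    num_mismatched = len(set_a - set_b) + len(set_b - set_a)
    # table_a.num_rows - table_b.num_rows
    row_count_difference = len(rows_a) - len(rows_b)
    return row_count_difference, num_mismatched

table_a = [(1, "x"), (2, "y"), (3, "z")]
table_b = [(1, "x"), (2, "changed")]
print(rows_different(table_a, table_b))  # (1, 3)
```

Identical relations yield `(0, 0)`, which is what the adapter test suite checks for via `get_rows_different_sql`.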
