# mssql.py

"""Support for the Microsoft SQL Server database.

Driver
------

The MSSQL dialect will work with three different available drivers:

* *pyodbc* - http://pyodbc.sourceforge.net/. This is the recommended
  driver.

* *pymssql* - http://pymssql.sourceforge.net/

* *adodbapi* - http://adodbapi.sourceforge.net/

Drivers are loaded in the order listed above based on availability.

If you need to load a specific driver pass ``module_name`` when
creating the engine::

    engine = create_engine('mssql://dsn', module_name='pymssql')

``module_name`` currently accepts: ``pyodbc``, ``pymssql``, and
``adodbapi``.

Currently the pyodbc driver offers the greatest level of
compatibility.

Connecting
----------

Connecting with create_engine() uses the standard URL approach of
``mssql://user:pass@host/dbname[?key=value&key=value...]``.

If the database name is present, the tokens are converted to a
connection string with the specified values. If the database is not
present, then the host token is taken directly as the DSN name.

Examples of pyodbc connection string URLs:

* *mssql://mydsn* - connects using the specified DSN named ``mydsn``.
  The connection string that is created will appear like::

    dsn=mydsn;Trusted_Connection=Yes

* *mssql://user:pass@mydsn* - connects using the DSN named
  ``mydsn`` passing in the ``UID`` and ``PWD`` information. The
  connection string that is created will appear like::

    dsn=mydsn;UID=user;PWD=pass

* *mssql://user:pass@mydsn/?LANGUAGE=us_english* - connects
  using the DSN named ``mydsn`` passing in the ``UID`` and ``PWD``
  information, plus the additional connection configuration option
  ``LANGUAGE``. The connection string that is created will appear
  like::

    dsn=mydsn;UID=user;PWD=pass;LANGUAGE=us_english

* *mssql://user:pass@host/db* - connects using a dynamically created
  connection string that would appear like::

    DRIVER={SQL Server};Server=host;Database=db;UID=user;PWD=pass

* *mssql://user:pass@host:123/db* - connects using a connection
  string that is dynamically created, which also includes the port
  information using the comma syntax. If your connection string
  requires the port information to be passed as a ``port`` keyword
  see the next example. This will create the following connection
  string::

    DRIVER={SQL Server};Server=host,123;Database=db;UID=user;PWD=pass

* *mssql://user:pass@host/db?port=123* - connects using a connection
  string that is dynamically created, which includes the port
  information as a separate ``port`` keyword. This will create the
  following connection string::

    DRIVER={SQL Server};Server=host;Database=db;UID=user;PWD=pass;port=123

If you require a connection string that is outside the options
presented above, use the ``odbc_connect`` keyword to pass in a
urlencoded connection string. What gets passed in will be urldecoded
and passed directly.

For example::

    mssql:///?odbc_connect=dsn%3Dmydsn%3BDatabase%3Ddb

would create the following connection string::

    dsn=mydsn;Database=db

Encoding your connection string can be easily accomplished through
the python shell. For example::

    >>> import urllib
    >>> urllib.quote_plus('dsn=mydsn;Database=db')
    'dsn%3Dmydsn%3BDatabase%3Ddb'

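As a concrete round trip of the encoding step above (shown here with the
Python 3 location of ``quote_plus``, ``urllib.parse``, purely for
illustration; the module itself targets the Python 2 ``urllib`` API):

```python
from urllib.parse import quote_plus, unquote_plus

# Encode a raw ODBC connection string for the odbc_connect query
# argument, then decode it back to verify the round trip.
raw = 'dsn=mydsn;Database=db'
encoded = quote_plus(raw)
url = 'mssql:///?odbc_connect=' + encoded

assert encoded == 'dsn%3Dmydsn%3BDatabase%3Ddb'
assert unquote_plus(encoded) == raw
```
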
Additional arguments which may be specified either as query string
arguments on the URL, or as keyword arguments to
:func:`~sqlalchemy.create_engine()` are:

* *auto_identity_insert* - enables support for IDENTITY inserts by
  automatically turning IDENTITY INSERT ON and OFF as required.
  Defaults to ``True``.

* *query_timeout* - allows you to override the default query timeout.
  Defaults to ``None``. This is only supported on pymssql.

* *text_as_varchar* - if enabled this will treat all TEXT column
  types as their equivalent VARCHAR(max) type. This is often used if
  you need to compare a VARCHAR to a TEXT field, which is not
  supported directly on MSSQL. Defaults to ``False``.

* *use_scope_identity* - allows you to specify that SCOPE_IDENTITY
  should be used in place of the non-scoped version @@IDENTITY.
  Defaults to ``False``. On pymssql this defaults to ``True``, and on
  pyodbc this defaults to ``True`` if the version of pyodbc being
  used supports it.

* *has_window_funcs* - indicates whether or not window functions
  (LIMIT and OFFSET) are supported on the version of MSSQL being
  used. If you're running MSSQL 2005 or later turn this on to get
  OFFSET support. Defaults to ``False``.

* *max_identifier_length* - allows you to set the maximum length of
  identifiers supported by the database. Defaults to 128. For pymssql
  the default is 30.

* *schema_name* - used to set the schema name. Defaults to ``dbo``.

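For instance, several of the flags above can be combined on the URL's
query string; this sketch simply assembles such a URL with
``urllib.parse.urlencode`` (a Python 3 helper used here only for
illustration):

```python
from urllib.parse import urlencode

# Options from the list above, passed as query string arguments.
options = [('has_window_funcs', 1), ('use_scope_identity', 1)]
url = 'mssql://user:pass@mydsn/?' + urlencode(options)

assert url == 'mssql://user:pass@mydsn/?has_window_funcs=1&use_scope_identity=1'
```
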
Auto Increment Behavior
-----------------------

``IDENTITY`` columns are supported by using SQLAlchemy
``schema.Sequence()`` objects. In other words::

    Table('test', mss_engine,
        Column('id', Integer,
            Sequence('blah', 100, 10), primary_key=True),
        Column('name', String(20))
    ).create()

would yield::

    CREATE TABLE test (
        id INTEGER NOT NULL IDENTITY(100,10) PRIMARY KEY,
        name VARCHAR(20) NULL,
    )

Note that the ``start`` and ``increment`` values for sequences are
optional and will default to 1, 1.

* Support for ``SET IDENTITY_INSERT ON`` mode (automagic on / off for
  ``INSERT`` s)

* Support for auto-fetching of ``@@IDENTITY/@@SCOPE_IDENTITY()`` on
  ``INSERT``

Collation Support
-----------------

MSSQL specific string types support a collation parameter that
creates a column-level specific collation for the column. The
collation parameter accepts a Windows Collation Name or a SQL
Collation Name. Supported types are MSChar, MSNChar, MSString,
MSNVarchar, MSText, and MSNText. For example::

    Column('login', String(32, collation='Latin1_General_CI_AS'))

will yield::

    login VARCHAR(32) COLLATE Latin1_General_CI_AS NULL

LIMIT/OFFSET Support
--------------------

MSSQL has no direct support for the LIMIT or OFFSET keywords. LIMIT is
supported through the ``TOP`` Transact-SQL keyword::

    select.limit

will yield::

    SELECT TOP n

If the ``has_window_funcs`` flag is set then LIMIT with OFFSET
support is available through the ``ROW_NUMBER OVER`` construct. This
construct requires an ``ORDER BY`` to be specified as well and is
only available on MSSQL 2005 and later.

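The shape of the emitted pagination SQL can be sketched with a small
helper; this is an illustrative mock-up (``row_number_page`` is a
hypothetical name, not part of the dialect), not the dialect's actual
compiler logic:

```python
def row_number_page(select_list, from_clause, order_by, limit, offset):
    # Number each row of the original query with ROW_NUMBER() OVER
    # (ORDER BY ...), then keep only the rows inside the requested
    # window - the general shape used for LIMIT/OFFSET when
    # has_window_funcs is enabled.
    inner = ("SELECT %s, ROW_NUMBER() OVER (ORDER BY %s) AS mssql_rn "
             "FROM %s" % (select_list, order_by, from_clause))
    return ("SELECT %s FROM (%s) AS _paged "
            "WHERE mssql_rn > %d AND mssql_rn <= %d"
            % (select_list, inner, offset, offset + limit))
```
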
Nullability
-----------
MSSQL has support for three levels of column nullability. The default
nullability allows nulls and is explicit in the CREATE TABLE
construct::

    name VARCHAR(20) NULL

If ``nullable=None`` is specified then no specification is made. In
other words the database's configured default is used. This will
render::

    name VARCHAR(20)

If ``nullable`` is ``True`` or ``False`` then the column will be
``NULL`` or ``NOT NULL`` respectively.

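The three-way ``nullable`` flag maps to DDL roughly as in this
illustrative helper (``render_nullability`` is a hypothetical name,
not part of the dialect):

```python
def render_nullability(nullable):
    # nullable=None  -> emit nothing; the database default applies.
    # nullable=True  -> explicit NULL.
    # nullable=False -> explicit NOT NULL.
    if nullable is None:
        return ""
    return "NULL" if nullable else "NOT NULL"
```
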
Date / Time Handling
--------------------
For MSSQL versions that support the ``DATE`` and ``TIME`` types
(MSSQL 2008+) the data type is used. For versions that do not
support the ``DATE`` and ``TIME`` types a ``DATETIME`` type is used
instead and the MSSQL dialect handles converting the results
properly. This means ``Date()`` and ``Time()`` are fully supported
on all versions of MSSQL. If you do not desire this behavior then
do not use the ``Date()`` or ``Time()`` types.

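A simplified sketch of that fallback conversion (the real logic lives
in the ``MSDateTimeAsDate`` and ``MSDateTimeAsTime`` types defined
later in this module; these helper names are illustrative only):

```python
import datetime

def date_to_bind(value):
    # Widen a plain date into a datetime so a DATETIME column accepts it.
    if type(value) is datetime.date:
        return datetime.datetime(value.year, value.month, value.day)
    return value

def date_from_result(value):
    # Truncate a DATETIME result back to just the date portion.
    if type(value) is datetime.datetime:
        return value.date()
    return value
```
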
Compatibility Levels
--------------------
MSSQL supports the notion of setting compatibility levels at the
database level. This allows, for instance, running a database that
is compatible with SQL2000 while running on a SQL2005 database
server. ``server_version_info`` will always return the database
server version information (in this case SQL2005) and not the
compatibility level information. Because of this, if running under
a backwards compatibility mode SQLAlchemy may attempt to use T-SQL
statements that are unable to be parsed by the database server.

Known Issues
------------

* No support for more than one ``IDENTITY`` column per table

* pymssql has problems with binary and unicode data that this module
  does **not** work around

"""
import datetime, decimal, inspect, operator, re, sys, urllib

from sqlalchemy import sql, schema, exc, util
from sqlalchemy import Table, MetaData, Column, ForeignKey, String, Integer
from sqlalchemy.sql import select, compiler, expression, operators as sql_operators, functions as sql_functions
from sqlalchemy.engine import default, base
from sqlalchemy import types as sqltypes
from decimal import Decimal as _python_Decimal


RESERVED_WORDS = set(
    ['add', 'all', 'alter', 'and', 'any', 'as', 'asc', 'authorization',
     'backup', 'begin', 'between', 'break', 'browse', 'bulk', 'by', 'cascade',
     'case', 'check', 'checkpoint', 'close', 'clustered', 'coalesce',
     'collate', 'column', 'commit', 'compute', 'constraint', 'contains',
     'containstable', 'continue', 'convert', 'create', 'cross', 'current',
     'current_date', 'current_time', 'current_timestamp', 'current_user',
     'cursor', 'database', 'dbcc', 'deallocate', 'declare', 'default',
     'delete', 'deny', 'desc', 'disk', 'distinct', 'distributed', 'double',
     'drop', 'dump', 'else', 'end', 'errlvl', 'escape', 'except', 'exec',
     'execute', 'exists', 'exit', 'external', 'fetch', 'file', 'fillfactor',
     'for', 'foreign', 'freetext', 'freetexttable', 'from', 'full',
     'function', 'goto', 'grant', 'group', 'having', 'holdlock', 'identity',
     'identity_insert', 'identitycol', 'if', 'in', 'index', 'inner', 'insert',
     'intersect', 'into', 'is', 'join', 'key', 'kill', 'left', 'like',
     'lineno', 'load', 'merge', 'national', 'nocheck', 'nonclustered', 'not',
     'null', 'nullif', 'of', 'off', 'offsets', 'on', 'open', 'opendatasource',
     'openquery', 'openrowset', 'openxml', 'option', 'or', 'order', 'outer',
     'over', 'percent', 'pivot', 'plan', 'precision', 'primary', 'print',
     'proc', 'procedure', 'public', 'raiserror', 'read', 'readtext',
     'reconfigure', 'references', 'replication', 'restore', 'restrict',
     'return', 'revert', 'revoke', 'right', 'rollback', 'rowcount',
     'rowguidcol', 'rule', 'save', 'schema', 'securityaudit', 'select',
     'session_user', 'set', 'setuser', 'shutdown', 'some', 'statistics',
     'system_user', 'table', 'tablesample', 'textsize', 'then', 'to', 'top',
     'tran', 'transaction', 'trigger', 'truncate', 'tsequal', 'union',
     'unique', 'unpivot', 'update', 'updatetext', 'use', 'user', 'values',
     'varying', 'view', 'waitfor', 'when', 'where', 'while', 'with',
     'writetext',
    ])


class _StringType(object):
    """Base for MSSQL string types."""

    def __init__(self, collation=None, **kwargs):
        self.collation = kwargs.get('collate', collation)

    def _extend(self, spec):
        """Extend a string-type declaration with standard SQL
        COLLATE annotations.
        """

        if self.collation:
            collation = 'COLLATE %s' % self.collation
        else:
            collation = None

        return ' '.join([c for c in (spec, collation)
                         if c is not None])

    def __repr__(self):
        attributes = inspect.getargspec(self.__init__)[0][1:]
        attributes.extend(inspect.getargspec(_StringType.__init__)[0][1:])

        params = {}
        for attr in attributes:
            val = getattr(self, attr)
            if val is not None and val is not False:
                params[attr] = val

        return "%s(%s)" % (self.__class__.__name__,
                           ', '.join(['%s=%r' % (k, params[k]) for k in params]))

    def bind_processor(self, dialect):
        if self.convert_unicode or dialect.convert_unicode:
            if self.assert_unicode is None:
                assert_unicode = dialect.assert_unicode
            else:
                assert_unicode = self.assert_unicode

            if not assert_unicode:
                return None

            def process(value):
                if not isinstance(value, (unicode, sqltypes.NoneType)):
                    if assert_unicode == 'warn':
                        util.warn("Unicode type received non-unicode bind "
                                  "param value %r" % value)
                        return value
                    else:
                        raise exc.InvalidRequestError("Unicode type received non-unicode bind param value %r" % value)
                else:
                    return value
            return process
        else:
            return None


class MSNumeric(sqltypes.Numeric):
    def result_processor(self, dialect):
        if self.asdecimal:
            def process(value):
                if value is not None:
                    return _python_Decimal(str(value))
                else:
                    return value
            return process
        else:
            def process(value):
                return float(value)
            return process

    def bind_processor(self, dialect):
        def process(value):
            if value is None:
                # Not sure that this exception is needed
                return value

            elif isinstance(value, decimal.Decimal):
                if value.adjusted() < 0:
                    result = "%s0.%s%s" % (
                        (value < 0 and '-' or ''),
                        '0' * (abs(value.adjusted()) - 1),
                        "".join([str(nint) for nint in value._int]))

                else:
                    if 'E' in str(value):
                        result = "%s%s%s" % (
                            (value < 0 and '-' or ''),
                            "".join([str(s) for s in value._int]),
                            "0" * (value.adjusted() - (len(value._int)-1)))
                    else:
                        if (len(value._int) - 1) > value.adjusted():
                            result = "%s%s.%s" % (
                                (value < 0 and '-' or ''),
                                "".join([str(s) for s in value._int][0:value.adjusted() + 1]),
                                "".join([str(s) for s in value._int][value.adjusted() + 1:]))
                        else:
                            result = "%s%s" % (
                                (value < 0 and '-' or ''),
                                "".join([str(s) for s in value._int][0:value.adjusted() + 1]))

                return result

            else:
                return value

        return process

    def get_col_spec(self):
        if self.precision is None:
            return "NUMERIC"
        else:
            return "NUMERIC(%(precision)s, %(scale)s)" % {'precision': self.precision, 'scale' : self.scale}


class MSFloat(sqltypes.Float):
    def get_col_spec(self):
        if self.precision is None:
            return "FLOAT"
        else:
            return "FLOAT(%(precision)s)" % {'precision': self.precision}


class MSReal(MSFloat):
    """A type for ``real`` numbers."""

    def __init__(self):
        """
        Construct a Real.

        """
        super(MSReal, self).__init__(precision=24)

    def adapt(self, impltype):
        return impltype()

    def get_col_spec(self):
        return "REAL"


class MSInteger(sqltypes.Integer):
    def get_col_spec(self):
        return "INTEGER"


class MSBigInteger(MSInteger):
    def get_col_spec(self):
        return "BIGINT"


class MSTinyInteger(MSInteger):
    def get_col_spec(self):
        return "TINYINT"


class MSSmallInteger(MSInteger):
    def get_col_spec(self):
        return "SMALLINT"


class _DateTimeType(object):
    """Base for MSSQL datetime types."""

    def bind_processor(self, dialect):
        # if we receive just a date we can manipulate it
        # into a datetime since the db-api may not do this.
        def process(value):
            if type(value) is datetime.date:
                return datetime.datetime(value.year, value.month, value.day)
            return value
        return process


class MSDateTime(_DateTimeType, sqltypes.DateTime):
    def get_col_spec(self):
        return "DATETIME"


class MSDate(sqltypes.Date):
    def get_col_spec(self):
        return "DATE"


class MSTime(sqltypes.Time):
    def __init__(self, precision=None, **kwargs):
        self.precision = precision
        super(MSTime, self).__init__()

    def get_col_spec(self):
        if self.precision:
            return "TIME(%s)" % self.precision
        else:
            return "TIME"


class MSSmallDateTime(_DateTimeType, sqltypes.TypeEngine):
    def get_col_spec(self):
        return "SMALLDATETIME"


class MSDateTime2(_DateTimeType, sqltypes.TypeEngine):
    def __init__(self, precision=None, **kwargs):
        self.precision = precision

    def get_col_spec(self):
        if self.precision:
            return "DATETIME2(%s)" % self.precision
        else:
            return "DATETIME2"


class MSDateTimeOffset(_DateTimeType, sqltypes.TypeEngine):
    def __init__(self, precision=None, **kwargs):
        self.precision = precision

    def get_col_spec(self):
        if self.precision:
            return "DATETIMEOFFSET(%s)" % self.precision
        else:
            return "DATETIMEOFFSET"


class MSDateTimeAsDate(_DateTimeType, MSDate):
    """This is an implementation of the Date type for versions of MSSQL that
    do not support that specific type. In order to make it work a ``DATETIME``
    column specification is used and the results get converted back to just
    the date portion.

    """

    def get_col_spec(self):
        return "DATETIME"

    def result_processor(self, dialect):
        def process(value):
            # If the DBAPI returns the value as datetime.datetime(), truncate
            # it back to datetime.date()
            if type(value) is datetime.datetime:
                return value.date()
            return value
        return process


class MSDateTimeAsTime(MSTime):
    """This is an implementation of the Time type for versions of MSSQL that
    do not support that specific type. In order to make it work a ``DATETIME``
    column specification is used and the results get converted back to just
    the time portion.

    """

    __zero_date = datetime.date(1900, 1, 1)

    def get_col_spec(self):
        return "DATETIME"

    def bind_processor(self, dialect):
        def process(value):
            if type(value) is datetime.datetime:
                value = datetime.datetime.combine(self.__zero_date, value.time())
            elif type(value) is datetime.time:
                value = datetime.datetime.combine(self.__zero_date, value)
            return value
        return process

    def result_processor(self, dialect):
        def process(value):
            if type(value) is datetime.datetime:
                return value.time()
            elif type(value) is datetime.date:
                return datetime.time(0, 0, 0)
            return value
        return process


class MSDateTime_adodbapi(MSDateTime):
    def result_processor(self, dialect):
        def process(value):
            # adodbapi will return datetimes with empty time values as
            # datetime.date() objects.
            # Promote them back to full datetime.datetime()
            if type(value) is datetime.date:
                return datetime.datetime(value.year, value.month, value.day)
            return value
        return process


class MSText(_StringType, sqltypes.Text):
    """MSSQL TEXT type, for variable-length text up to 2^31 characters."""

    def __init__(self, *args, **kwargs):
        """Construct a TEXT.

        :param collation: Optional, a column-level collation for this string
          value. Accepts a Windows Collation Name or a SQL Collation Name.

        """
        _StringType.__init__(self, **kwargs)
        sqltypes.Text.__init__(self, None,
                convert_unicode=kwargs.get('convert_unicode', False),
                assert_unicode=kwargs.get('assert_unicode', None))

    def get_col_spec(self):
        if self.dialect.text_as_varchar:
            return self._extend("VARCHAR(max)")
        else:
            return self._extend("TEXT")


class MSNText(_StringType, sqltypes.UnicodeText):
    """MSSQL NTEXT type, for variable-length unicode text up to 2^30
    characters."""

    def __init__(self, *args, **kwargs):
        """Construct a NTEXT.

        :param collation: Optional, a column-level collation for this string
          value. Accepts a Windows Collation Name or a SQL Collation Name.

        """
        _StringType.__init__(self, **kwargs)
        sqltypes.UnicodeText.__init__(self, None,
                convert_unicode=kwargs.get('convert_unicode', True),
                assert_unicode=kwargs.get('assert_unicode', 'warn'))

    def get_col_spec(self):
        if self.dialect.text_as_varchar:
            return self._extend("NVARCHAR(max)")
        else:
            return self._extend("NTEXT")


class MSString(_StringType, sqltypes.String):
    """MSSQL VARCHAR type, for variable-length non-Unicode data with a maximum
    of 8,000 characters."""

    def __init__(self, length=None, convert_unicode=False, assert_unicode=None, **kwargs):
        """Construct a VARCHAR.

        :param length: Optional, maximum data length, in characters.

        :param convert_unicode: defaults to False. If True, convert
          ``unicode`` data sent to the database to a ``str``
          bytestring, and convert bytestrings coming back from the
          database into ``unicode``.

          Bytestrings are encoded using the dialect's
          :attr:`~sqlalchemy.engine.base.Dialect.encoding`, which
          defaults to `utf-8`.

          If False, may be overridden by
          :attr:`sqlalchemy.engine.base.Dialect.convert_unicode`.

        :param assert_unicode:

          If None (the default), no assertion will take place unless
          overridden by :attr:`sqlalchemy.engine.base.Dialect.assert_unicode`.

          If 'warn', will issue a runtime warning if a ``str``
          instance is used as a bind value.

          If true, will raise an :exc:`sqlalchemy.exc.InvalidRequestError`.

        :param collation: Optional, a column-level collation for this string
          value. Accepts a Windows Collation Name or a SQL Collation Name.

        """
        _StringType.__init__(self, **kwargs)
        sqltypes.String.__init__(self, length=length,
                convert_unicode=convert_unicode,
                assert_unicode=assert_unicode)

    def get_col_spec(self):
        if self.length:
            return self._extend("VARCHAR(%s)" % self.length)
        else:
            return self._extend("VARCHAR")


class MSNVarchar(_StringType, sqltypes.Unicode):
    """MSSQL NVARCHAR type.

    For variable-length unicode character data up to 4,000 characters."""

    def __init__(self, length=None, **kwargs):
        """Construct a NVARCHAR.

        :param length: Optional, maximum data length, in characters.

        :param collation: Optional, a column-level collation for this string
          value. Accepts a Windows Collation Name or a SQL Collation Name.

        """
        _StringType.__init__(self, **kwargs)
        sqltypes.Unicode.__init__(self, length=length,
                convert_unicode=kwargs.get('convert_unicode', True),
                assert_unicode=kwargs.get('assert_unicode', 'warn'))

    def adapt(self, impltype):
        return impltype(length=self.length,
                        convert_unicode=self.convert_unicode,
                        assert_unicode=self.assert_unicode,
                        collation=self.collation)

    def get_col_spec(self):
        if self.length:
            return self._extend("NVARCHAR(%(length)s)" % {'length' : self.length})
        else:
            return self._extend("NVARCHAR")


| 694 | class MSChar(_StringType, sqltypes.CHAR): |
---|
| 695 | """MSSQL CHAR type, for fixed-length non-Unicode data with a maximum |
---|
| 696 | of 8,000 characters.""" |
---|
| 697 | |
---|
| 698 | def __init__(self, length=None, convert_unicode=False, assert_unicode=None, **kwargs): |
---|
| 699 | """Construct a CHAR. |
---|
| 700 | |
---|
| 701 | :param length: Optinal, maximum data length, in characters. |
---|
| 702 | |
---|
| 703 | :param convert_unicode: defaults to False. If True, convert |
---|
| 704 | ``unicode`` data sent to the database to a ``str`` |
---|
| 705 | bytestring, and convert bytestrings coming back from the |
---|
| 706 | database into ``unicode``. |
---|
| 707 | |
---|
| 708 | Bytestrings are encoded using the dialect's |
---|
| 709 | :attr:`~sqlalchemy.engine.base.Dialect.encoding`, which |
---|
| 710 | defaults to `utf-8`. |
---|
| 711 | |
---|
| 712 | If False, may be overridden by |
---|
| 713 | :attr:`sqlalchemy.engine.base.Dialect.convert_unicode`. |
---|
| 714 | |
---|
| 715 | :param assert_unicode: |
---|
| 716 | |
---|
| 717 | If None (the default), no assertion will take place unless |
---|
| 718 | overridden by :attr:`sqlalchemy.engine.base.Dialect.assert_unicode`. |
---|
| 719 | |
---|
| 720 | If 'warn', will issue a runtime warning if a ``str`` |
---|
| 721 | instance is used as a bind value. |
---|
| 722 | |
---|
| 723 | If true, will raise an :exc:`sqlalchemy.exc.InvalidRequestError`. |
---|
| 724 | |
---|
| 725 | :param collation: Optional, a column-level collation for this string |
---|
| 726 | value. Accepts a Windows Collation Name or a SQL Collation Name. |
---|
| 727 | |
---|
| 728 | """ |
---|
| 729 | _StringType.__init__(self, **kwargs) |
---|
| 730 | sqltypes.CHAR.__init__(self, length=length, |
---|
| 731 | convert_unicode=convert_unicode, |
---|
| 732 | assert_unicode=assert_unicode) |
---|
| 733 | |
---|
| 734 | def get_col_spec(self): |
---|
| 735 | if self.length: |
---|
| 736 | return self._extend("CHAR(%s)" % self.length) |
---|
| 737 | else: |
---|
| 738 | return self._extend("CHAR") |
---|
| 739 | |
---|
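As a rough illustration of what `get_col_spec` produces for these character types, the stand-alone helper below mirrors the length branch above; the `COLLATE` suffix imitates what the dialect's `_extend()` helper appends and is an assumption here, not a copy of it:

```python
# Sketch (not the module's code) of how MSChar.get_col_spec renders
# a column specification: a length yields CHAR(n), otherwise bare CHAR,
# with an optional COLLATE clause appended.
def char_col_spec(length=None, collation=None):
    spec = "CHAR(%s)" % length if length else "CHAR"
    if collation:
        spec += " COLLATE %s" % collation
    return spec

print(char_col_spec(10))  # CHAR(10)
```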
| 740 | |
---|
| 741 | class MSNChar(_StringType, sqltypes.NCHAR): |
---|
| 742 | """MSSQL NCHAR type. |
---|
| 743 | |
---|
| 744 | For fixed-length unicode character data up to 4,000 characters.""" |
---|
| 745 | |
---|
| 746 | def __init__(self, length=None, **kwargs): |
---|
| 747 | """Construct an NCHAR. |
---|
| 748 | |
---|
| 749 | :param length: Optional, maximum data length, in characters. |
---|
| 750 | |
---|
| 751 | :param collation: Optional, a column-level collation for this string |
---|
| 752 | value. Accepts a Windows Collation Name or a SQL Collation Name. |
---|
| 753 | |
---|
| 754 | """ |
---|
| 755 | _StringType.__init__(self, **kwargs) |
---|
| 756 | sqltypes.NCHAR.__init__(self, length=length, |
---|
| 757 | convert_unicode=kwargs.get('convert_unicode', True), |
---|
| 758 | assert_unicode=kwargs.get('assert_unicode', 'warn')) |
---|
| 759 | |
---|
| 760 | def get_col_spec(self): |
---|
| 761 | if self.length: |
---|
| 762 | return self._extend("NCHAR(%(length)s)" % {'length' : self.length}) |
---|
| 763 | else: |
---|
| 764 | return self._extend("NCHAR") |
---|
| 765 | |
---|
| 766 | |
---|
| 767 | class MSGenericBinary(sqltypes.Binary): |
---|
| 768 | """The Binary type assumes that a Binary specification without a length |
---|
| 769 | is an unbounded Binary type, whereas one with a length specification results |
---|
| 770 | in a fixed-length Binary type. |
---|
| 771 | |
---|
| 772 | If you want standard MSSQL ``BINARY`` behavior use the ``MSBinary`` type. |
---|
| 773 | |
---|
| 774 | """ |
---|
| 775 | |
---|
| 776 | def get_col_spec(self): |
---|
| 777 | if self.length: |
---|
| 778 | return "BINARY(%s)" % self.length |
---|
| 779 | else: |
---|
| 780 | return "IMAGE" |
---|
| 781 | |
---|
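Stripped of the type machinery, MSGenericBinary's rule reduces to the following stand-alone sketch (the helper name is illustrative only):

```python
# A length produces a fixed-width BINARY(n) column; no length falls
# back to the unbounded IMAGE type, as in MSGenericBinary.get_col_spec.
def generic_binary_spec(length=None):
    if length:
        return "BINARY(%s)" % length
    return "IMAGE"
```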
| 782 | |
---|
| 783 | class MSBinary(MSGenericBinary): |
---|
| 784 | def get_col_spec(self): |
---|
| 785 | if self.length: |
---|
| 786 | return "BINARY(%s)" % self.length |
---|
| 787 | else: |
---|
| 788 | return "BINARY" |
---|
| 789 | |
---|
| 790 | |
---|
| 791 | class MSVarBinary(MSGenericBinary): |
---|
| 792 | def get_col_spec(self): |
---|
| 793 | if self.length: |
---|
| 794 | return "VARBINARY(%s)" % self.length |
---|
| 795 | else: |
---|
| 796 | return "VARBINARY" |
---|
| 797 | |
---|
| 798 | |
---|
| 799 | class MSImage(MSGenericBinary): |
---|
| 800 | def get_col_spec(self): |
---|
| 801 | return "IMAGE" |
---|
| 802 | |
---|
| 803 | |
---|
| 804 | class MSBoolean(sqltypes.Boolean): |
---|
| 805 | def get_col_spec(self): |
---|
| 806 | return "BIT" |
---|
| 807 | |
---|
| 808 | def result_processor(self, dialect): |
---|
| 809 | def process(value): |
---|
| 810 | if value is None: |
---|
| 811 | return None |
---|
| 812 | return bool(value) |
---|
| 813 | return process |
---|
| 814 | |
---|
| 815 | def bind_processor(self, dialect): |
---|
| 816 | def process(value): |
---|
| 817 | if value is True: |
---|
| 818 | return 1 |
---|
| 819 | elif value is False: |
---|
| 820 | return 0 |
---|
| 821 | elif value is None: |
---|
| 822 | return None |
---|
| 823 | else: |
---|
| 824 | return bool(value) |
---|
| 825 | return process |
---|
| 826 | |
---|
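The BIT round trip performed by MSBoolean's two processors can be sketched in isolation like this (stand-in functions, not the dialect's own):

```python
# Python booleans (or None) go to the driver as 1/0/None on bind;
# values coming back are collapsed to True/False/None on fetch.
def bind_bit(value):
    if value is None:
        return None
    return 1 if value else 0

def result_bit(value):
    if value is None:
        return None
    return bool(value)
```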
| 827 | |
---|
| 828 | class MSTimeStamp(sqltypes.TIMESTAMP): |
---|
| 829 | def get_col_spec(self): |
---|
| 830 | return "TIMESTAMP" |
---|
| 831 | |
---|
| 832 | |
---|
| 833 | class MSMoney(sqltypes.TypeEngine): |
---|
| 834 | def get_col_spec(self): |
---|
| 835 | return "MONEY" |
---|
| 836 | |
---|
| 837 | |
---|
| 838 | class MSSmallMoney(MSMoney): |
---|
| 839 | def get_col_spec(self): |
---|
| 840 | return "SMALLMONEY" |
---|
| 841 | |
---|
| 842 | |
---|
| 843 | class MSUniqueIdentifier(sqltypes.TypeEngine): |
---|
| 844 | def get_col_spec(self): |
---|
| 845 | return "UNIQUEIDENTIFIER" |
---|
| 846 | |
---|
| 847 | |
---|
| 848 | class MSVariant(sqltypes.TypeEngine): |
---|
| 849 | def get_col_spec(self): |
---|
| 850 | return "SQL_VARIANT" |
---|
| 851 | |
---|
| 852 | ischema = MetaData() |
---|
| 853 | |
---|
| 854 | schemata = Table("SCHEMATA", ischema, |
---|
| 855 | Column("CATALOG_NAME", String, key="catalog_name"), |
---|
| 856 | Column("SCHEMA_NAME", String, key="schema_name"), |
---|
| 857 | Column("SCHEMA_OWNER", String, key="schema_owner"), |
---|
| 858 | schema="INFORMATION_SCHEMA") |
---|
| 859 | |
---|
| 860 | tables = Table("TABLES", ischema, |
---|
| 861 | Column("TABLE_CATALOG", String, key="table_catalog"), |
---|
| 862 | Column("TABLE_SCHEMA", String, key="table_schema"), |
---|
| 863 | Column("TABLE_NAME", String, key="table_name"), |
---|
| 864 | Column("TABLE_TYPE", String, key="table_type"), |
---|
| 865 | schema="INFORMATION_SCHEMA") |
---|
| 866 | |
---|
| 867 | columns = Table("COLUMNS", ischema, |
---|
| 868 | Column("TABLE_SCHEMA", String, key="table_schema"), |
---|
| 869 | Column("TABLE_NAME", String, key="table_name"), |
---|
| 870 | Column("COLUMN_NAME", String, key="column_name"), |
---|
| 871 | Column("IS_NULLABLE", Integer, key="is_nullable"), |
---|
| 872 | Column("DATA_TYPE", String, key="data_type"), |
---|
| 873 | Column("ORDINAL_POSITION", Integer, key="ordinal_position"), |
---|
| 874 | Column("CHARACTER_MAXIMUM_LENGTH", Integer, key="character_maximum_length"), |
---|
| 875 | Column("NUMERIC_PRECISION", Integer, key="numeric_precision"), |
---|
| 876 | Column("NUMERIC_SCALE", Integer, key="numeric_scale"), |
---|
| 877 | Column("COLUMN_DEFAULT", Integer, key="column_default"), |
---|
| 878 | Column("COLLATION_NAME", String, key="collation_name"), |
---|
| 879 | schema="INFORMATION_SCHEMA") |
---|
| 880 | |
---|
| 881 | constraints = Table("TABLE_CONSTRAINTS", ischema, |
---|
| 882 | Column("TABLE_SCHEMA", String, key="table_schema"), |
---|
| 883 | Column("TABLE_NAME", String, key="table_name"), |
---|
| 884 | Column("CONSTRAINT_NAME", String, key="constraint_name"), |
---|
| 885 | Column("CONSTRAINT_TYPE", String, key="constraint_type"), |
---|
| 886 | schema="INFORMATION_SCHEMA") |
---|
| 887 | |
---|
| 888 | column_constraints = Table("CONSTRAINT_COLUMN_USAGE", ischema, |
---|
| 889 | Column("TABLE_SCHEMA", String, key="table_schema"), |
---|
| 890 | Column("TABLE_NAME", String, key="table_name"), |
---|
| 891 | Column("COLUMN_NAME", String, key="column_name"), |
---|
| 892 | Column("CONSTRAINT_NAME", String, key="constraint_name"), |
---|
| 893 | schema="INFORMATION_SCHEMA") |
---|
| 894 | |
---|
| 895 | key_constraints = Table("KEY_COLUMN_USAGE", ischema, |
---|
| 896 | Column("TABLE_SCHEMA", String, key="table_schema"), |
---|
| 897 | Column("TABLE_NAME", String, key="table_name"), |
---|
| 898 | Column("COLUMN_NAME", String, key="column_name"), |
---|
| 899 | Column("CONSTRAINT_NAME", String, key="constraint_name"), |
---|
| 900 | Column("ORDINAL_POSITION", Integer, key="ordinal_position"), |
---|
| 901 | schema="INFORMATION_SCHEMA") |
---|
| 902 | |
---|
| 903 | ref_constraints = Table("REFERENTIAL_CONSTRAINTS", ischema, |
---|
| 904 | Column("CONSTRAINT_CATALOG", String, key="constraint_catalog"), |
---|
| 905 | Column("CONSTRAINT_SCHEMA", String, key="constraint_schema"), |
---|
| 906 | Column("CONSTRAINT_NAME", String, key="constraint_name"), |
---|
| 907 | Column("UNIQUE_CONSTRAINT_CATLOG", String, key="unique_constraint_catalog"), |
---|
| 908 | Column("UNIQUE_CONSTRAINT_SCHEMA", String, key="unique_constraint_schema"), |
---|
| 909 | Column("UNIQUE_CONSTRAINT_NAME", String, key="unique_constraint_name"), |
---|
| 910 | Column("MATCH_OPTION", String, key="match_option"), |
---|
| 911 | Column("UPDATE_RULE", String, key="update_rule"), |
---|
| 912 | Column("DELETE_RULE", String, key="delete_rule"), |
---|
| 913 | schema="INFORMATION_SCHEMA") |
---|
| 914 | |
---|
| 915 | def _has_implicit_sequence(column): |
---|
| 916 | return column.primary_key and \ |
---|
| 917 | column.autoincrement and \ |
---|
| 918 | isinstance(column.type, sqltypes.Integer) and \ |
---|
| 919 | not column.foreign_keys and \ |
---|
| 920 | ( |
---|
| 921 | column.default is None or |
---|
| 922 | ( |
---|
| 923 | isinstance(column.default, schema.Sequence) and |
---|
| 924 | column.default.optional) |
---|
| 925 | ) |
---|
| 926 | |
---|
| 927 | def _table_sequence_column(tbl): |
---|
| 928 | if not hasattr(tbl, '_ms_has_sequence'): |
---|
| 929 | tbl._ms_has_sequence = None |
---|
| 930 | for column in tbl.c: |
---|
| 931 | if getattr(column, 'sequence', False) or _has_implicit_sequence(column): |
---|
| 932 | tbl._ms_has_sequence = column |
---|
| 933 | break |
---|
| 934 | return tbl._ms_has_sequence |
---|
| 935 | |
---|
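The two helpers above decide whether a column acts as the table's IDENTITY column. Stripped of the SQLAlchemy objects, the rule amounts to the following sketch (the boolean parameters are stand-ins for the column attributes checked above):

```python
# An autoincrementing integer primary key with no foreign keys and
# either no default or only an optional Sequence default is treated
# as an implicit IDENTITY column.
def has_implicit_identity(primary_key, autoincrement, is_integer,
                          has_foreign_keys, has_default,
                          default_is_optional_sequence=False):
    return (primary_key and autoincrement and is_integer
            and not has_foreign_keys
            and (not has_default or default_is_optional_sequence))
```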
| 936 | class MSSQLExecutionContext(default.DefaultExecutionContext): |
---|
| 937 | IINSERT = False |
---|
| 938 | HASIDENT = False |
---|
| 939 | |
---|
| 940 | def pre_exec(self): |
---|
| 941 | """Activate IDENTITY_INSERT if needed.""" |
---|
| 942 | |
---|
| 943 | if self.compiled.isinsert: |
---|
| 944 | tbl = self.compiled.statement.table |
---|
| 945 | seq_column = _table_sequence_column(tbl) |
---|
| 946 | self.HASIDENT = bool(seq_column) |
---|
| 947 | if self.dialect.auto_identity_insert and self.HASIDENT: |
---|
| 948 | self.IINSERT = tbl._ms_has_sequence.key in self.compiled_parameters[0] |
---|
| 949 | else: |
---|
| 950 | self.IINSERT = False |
---|
| 951 | |
---|
| 952 | if self.IINSERT: |
---|
| 953 | self.cursor.execute("SET IDENTITY_INSERT %s ON" % |
---|
| 954 | self.dialect.identifier_preparer.format_table(self.compiled.statement.table)) |
---|
| 955 | |
---|
| 956 | def handle_dbapi_exception(self, e): |
---|
| 957 | if self.IINSERT: |
---|
| 958 | try: |
---|
| 959 | self.cursor.execute("SET IDENTITY_INSERT %s OFF" % self.dialect.identifier_preparer.format_table(self.compiled.statement.table)) |
---|
| 960 | except: |
---|
| 961 | pass |
---|
| 962 | |
---|
| 963 | def post_exec(self): |
---|
| 964 | """Disable IDENTITY_INSERT if enabled.""" |
---|
| 965 | |
---|
| 966 | if self.compiled.isinsert and not self.executemany and self.HASIDENT and not self.IINSERT: |
---|
| 967 | if not self._last_inserted_ids or self._last_inserted_ids[0] is None: |
---|
| 968 | if self.dialect.use_scope_identity: |
---|
| 969 | self.cursor.execute("SELECT scope_identity() AS lastrowid") |
---|
| 970 | else: |
---|
| 971 | self.cursor.execute("SELECT @@identity AS lastrowid") |
---|
| 972 | row = self.cursor.fetchone() |
---|
| 973 | self._last_inserted_ids = [int(row[0])] + self._last_inserted_ids[1:] |
---|
| 974 | |
---|
| 975 | if self.IINSERT: |
---|
| 976 | self.cursor.execute("SET IDENTITY_INSERT %s OFF" % self.dialect.identifier_preparer.format_table(self.compiled.statement.table)) |
---|
| 977 | |
---|
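The pre_exec/post_exec pair above brackets an INSERT that supplies an explicit IDENTITY value with `SET IDENTITY_INSERT ... ON/OFF`. A minimal sketch of the resulting statement sequence (the table name here is a placeholder; the real code formats it through the identifier preparer):

```python
# Build the list of statements the execution context would issue
# around one INSERT, depending on whether an explicit identity value
# was supplied in the parameters.
def identity_insert_statements(table, insert_sql, explicit_identity):
    stmts = []
    if explicit_identity:
        stmts.append("SET IDENTITY_INSERT %s ON" % table)
    stmts.append(insert_sql)
    if explicit_identity:
        stmts.append("SET IDENTITY_INSERT %s OFF" % table)
    return stmts
```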
| 978 | |
---|
| 979 | class MSSQLExecutionContext_pyodbc(MSSQLExecutionContext): |
---|
| 980 | def pre_exec(self): |
---|
| 981 | """where appropriate, issue "select scope_identity()" in the same statement""" |
---|
| 982 | super(MSSQLExecutionContext_pyodbc, self).pre_exec() |
---|
| 983 | if self.compiled.isinsert and self.HASIDENT and not self.IINSERT \ |
---|
| 984 | and len(self.parameters) == 1 and self.dialect.use_scope_identity: |
---|
| 985 | self.statement += "; select scope_identity()" |
---|
| 986 | |
---|
| 987 | def post_exec(self): |
---|
| 988 | if self.HASIDENT and not self.IINSERT and self.dialect.use_scope_identity and not self.executemany: |
---|
| 989 | import pyodbc |
---|
| 990 | # Fetch the last inserted id from the manipulated statement |
---|
| 991 | # We may have to skip over a number of result sets with no data (due to triggers, etc.) |
---|
| 992 | while True: |
---|
| 993 | try: |
---|
| 994 | row = self.cursor.fetchone() |
---|
| 995 | break |
---|
| 996 | except pyodbc.Error, e: |
---|
| 997 | self.cursor.nextset() |
---|
| 998 | self._last_inserted_ids = [int(row[0])] |
---|
| 999 | else: |
---|
| 1000 | super(MSSQLExecutionContext_pyodbc, self).post_exec() |
---|
| 1001 | |
---|
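The pyodbc context above piggybacks the identity fetch onto the INSERT itself by appending a second statement. That rewrite, sketched in isolation:

```python
# Only when the server generates the IDENTITY value (no explicit value
# supplied) and scope_identity is usable is the extra SELECT appended.
def append_scope_identity(statement, has_identity, explicit_value,
                          use_scope_identity):
    if has_identity and not explicit_value and use_scope_identity:
        return statement + "; select scope_identity()"
    return statement
```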
| 1002 | class MSSQLDialect(default.DefaultDialect): |
---|
| 1003 | name = 'mssql' |
---|
| 1004 | supports_default_values = True |
---|
| 1005 | supports_empty_insert = False |
---|
| 1006 | auto_identity_insert = True |
---|
| 1007 | execution_ctx_cls = MSSQLExecutionContext |
---|
| 1008 | text_as_varchar = False |
---|
| 1009 | use_scope_identity = False |
---|
| 1010 | has_window_funcs = False |
---|
| 1011 | max_identifier_length = 128 |
---|
| 1012 | schema_name = "dbo" |
---|
| 1013 | |
---|
| 1014 | colspecs = { |
---|
| 1015 | sqltypes.Unicode : MSNVarchar, |
---|
| 1016 | sqltypes.Integer : MSInteger, |
---|
| 1017 | sqltypes.Smallinteger: MSSmallInteger, |
---|
| 1018 | sqltypes.Numeric : MSNumeric, |
---|
| 1019 | sqltypes.Float : MSFloat, |
---|
| 1020 | sqltypes.DateTime : MSDateTime, |
---|
| 1021 | sqltypes.Date : MSDate, |
---|
| 1022 | sqltypes.Time : MSTime, |
---|
| 1023 | sqltypes.String : MSString, |
---|
| 1024 | sqltypes.Binary : MSGenericBinary, |
---|
| 1025 | sqltypes.Boolean : MSBoolean, |
---|
| 1026 | sqltypes.Text : MSText, |
---|
| 1027 | sqltypes.UnicodeText : MSNText, |
---|
| 1028 | sqltypes.CHAR: MSChar, |
---|
| 1029 | sqltypes.NCHAR: MSNChar, |
---|
| 1030 | sqltypes.TIMESTAMP: MSTimeStamp, |
---|
| 1031 | } |
---|
| 1032 | |
---|
| 1033 | ischema_names = { |
---|
| 1034 | 'int' : MSInteger, |
---|
| 1035 | 'bigint': MSBigInteger, |
---|
| 1036 | 'smallint' : MSSmallInteger, |
---|
| 1037 | 'tinyint' : MSTinyInteger, |
---|
| 1038 | 'varchar' : MSString, |
---|
| 1039 | 'nvarchar' : MSNVarchar, |
---|
| 1040 | 'char' : MSChar, |
---|
| 1041 | 'nchar' : MSNChar, |
---|
| 1042 | 'text' : MSText, |
---|
| 1043 | 'ntext' : MSNText, |
---|
| 1044 | 'decimal' : MSNumeric, |
---|
| 1045 | 'numeric' : MSNumeric, |
---|
| 1046 | 'float' : MSFloat, |
---|
| 1047 | 'datetime' : MSDateTime, |
---|
| 1048 | 'datetime2' : MSDateTime2, |
---|
| 1049 | 'datetimeoffset' : MSDateTimeOffset, |
---|
| 1050 | 'date': MSDate, |
---|
| 1051 | 'time': MSTime, |
---|
| 1052 | 'smalldatetime' : MSSmallDateTime, |
---|
| 1053 | 'binary' : MSBinary, |
---|
| 1054 | 'varbinary' : MSVarBinary, |
---|
| 1055 | 'bit': MSBoolean, |
---|
| 1056 | 'real' : MSFloat, |
---|
| 1057 | 'image' : MSImage, |
---|
| 1058 | 'timestamp': MSTimeStamp, |
---|
| 1059 | 'money': MSMoney, |
---|
| 1060 | 'smallmoney': MSSmallMoney, |
---|
| 1061 | 'uniqueidentifier': MSUniqueIdentifier, |
---|
| 1062 | 'sql_variant': MSVariant, |
---|
| 1063 | } |
---|
| 1064 | |
---|
| 1065 | def __new__(cls, *args, **kwargs): |
---|
| 1066 | if cls is not MSSQLDialect: |
---|
| 1067 | # this gets called with the dialect specific class |
---|
| 1068 | return super(MSSQLDialect, cls).__new__(cls) |
---|
| 1069 | dbapi = kwargs.get('dbapi', None) |
---|
| 1070 | if dbapi: |
---|
| 1071 | dialect = dialect_mapping.get(dbapi.__name__) |
---|
| 1072 | return dialect(**kwargs) |
---|
| 1073 | else: |
---|
| 1074 | return object.__new__(cls) |
---|
| 1075 | |
---|
| 1076 | def __init__(self, |
---|
| 1077 | auto_identity_insert=True, query_timeout=None, |
---|
| 1078 | text_as_varchar=False, use_scope_identity=False, |
---|
| 1079 | has_window_funcs=False, max_identifier_length=None, |
---|
| 1080 | schema_name="dbo", **opts): |
---|
| 1081 | self.auto_identity_insert = bool(auto_identity_insert) |
---|
| 1082 | self.query_timeout = int(query_timeout or 0) |
---|
| 1083 | self.schema_name = schema_name |
---|
| 1084 | |
---|
| 1085 | # TODO: the options below should use server version introspection to set themselves on connection |
---|
| 1086 | self.text_as_varchar = bool(text_as_varchar) |
---|
| 1087 | self.use_scope_identity = bool(use_scope_identity) |
---|
| 1088 | self.has_window_funcs = bool(has_window_funcs) |
---|
| 1089 | self.max_identifier_length = int(max_identifier_length or 0) or \ |
---|
| 1090 | self.max_identifier_length |
---|
| 1091 | super(MSSQLDialect, self).__init__(**opts) |
---|
| 1092 | |
---|
| 1093 | @classmethod |
---|
| 1094 | def dbapi(cls, module_name=None): |
---|
| 1095 | if module_name: |
---|
| 1096 | try: |
---|
| 1097 | dialect_cls = dialect_mapping[module_name] |
---|
| 1098 | return dialect_cls.import_dbapi() |
---|
| 1099 | except KeyError: |
---|
| 1100 | raise exc.InvalidRequestError("Unsupported MSSQL module '%s' requested (must be adodbapi, pymssql or pyodbc)" % module_name) |
---|
| 1101 | else: |
---|
| 1102 | for dialect_cls in [MSSQLDialect_pyodbc, MSSQLDialect_pymssql, MSSQLDialect_adodbapi]: |
---|
| 1103 | try: |
---|
| 1104 | return dialect_cls.import_dbapi() |
---|
| 1105 | except ImportError, e: |
---|
| 1106 | pass |
---|
| 1107 | else: |
---|
| 1108 | raise ImportError('No DBAPI module detected for MSSQL - please install pyodbc, pymssql, or adodbapi') |
---|
| 1109 | |
---|
| 1110 | @base.connection_memoize(('mssql', 'server_version_info')) |
---|
| 1111 | def server_version_info(self, connection): |
---|
| 1112 | """A tuple of the database server version. |
---|
| 1113 | |
---|
| 1114 | Formats the remote server version as a tuple of version values, |
---|
| 1115 | e.g. ``(9, 0, 1399)``. If there are strings in the version number |
---|
| 1116 | they will be in the tuple too, so don't count on these all being |
---|
| 1117 | ``int`` values. |
---|
| 1118 | |
---|
| 1119 | This is a fast check that does not require a round trip. It is also |
---|
| 1120 | cached per-Connection. |
---|
| 1121 | """ |
---|
| 1122 | return connection.dialect._server_version_info(connection.connection) |
---|
| 1123 | |
---|
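`server_version_info` above returns a tuple such as `(9, 0, 1399)`, possibly with string components. A sketch of how a dotted version string could be folded into such a tuple, keeping any non-numeric parts as strings (the helper name is illustrative, not part of the dialect):

```python
# Split a dotted version string into a tuple, converting each piece
# to int where possible and keeping it as a string otherwise.
def version_tuple(vstring):
    parts = []
    for piece in vstring.split('.'):
        try:
            parts.append(int(piece))
        except ValueError:
            parts.append(piece)
    return tuple(parts)
```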
| 1124 | def _server_version_info(self, dbapi_con): |
---|
| 1125 | """Return a tuple of the database's version number.""" |
---|
| 1126 | raise NotImplementedError() |
---|
| 1127 | |
---|
| 1128 | def create_connect_args(self, url): |
---|
| 1129 | opts = url.translate_connect_args(username='user') |
---|
| 1130 | opts.update(url.query) |
---|
| 1131 | if 'auto_identity_insert' in opts: |
---|
| 1132 | self.auto_identity_insert = bool(int(opts.pop('auto_identity_insert'))) |
---|
| 1133 | if 'query_timeout' in opts: |
---|
| 1134 | self.query_timeout = int(opts.pop('query_timeout')) |
---|
| 1135 | if 'text_as_varchar' in opts: |
---|
| 1136 | self.text_as_varchar = bool(int(opts.pop('text_as_varchar'))) |
---|
| 1137 | if 'use_scope_identity' in opts: |
---|
| 1138 | self.use_scope_identity = bool(int(opts.pop('use_scope_identity'))) |
---|
| 1139 | if 'has_window_funcs' in opts: |
---|
| 1140 | self.has_window_funcs = bool(int(opts.pop('has_window_funcs'))) |
---|
| 1141 | return self.make_connect_string(opts, url.query) |
---|
| 1142 | |
---|
| 1143 | def type_descriptor(self, typeobj): |
---|
| 1144 | newobj = sqltypes.adapt_type(typeobj, self.colspecs) |
---|
| 1145 | # Some types need to know about the dialect |
---|
| 1146 | if isinstance(newobj, (MSText, MSNText)): |
---|
| 1147 | newobj.dialect = self |
---|
| 1148 | return newobj |
---|
| 1149 | |
---|
| 1150 | def do_savepoint(self, connection, name): |
---|
| 1151 | util.warn("Savepoint support in mssql is experimental and may lead to data loss.") |
---|
| 1152 | connection.execute("IF @@TRANCOUNT = 0 BEGIN TRANSACTION") |
---|
| 1153 | connection.execute("SAVE TRANSACTION %s" % name) |
---|
| 1154 | |
---|
| 1155 | def do_release_savepoint(self, connection, name): |
---|
| 1156 | pass |
---|
| 1157 | |
---|
| 1158 | @base.connection_memoize(('dialect', 'default_schema_name')) |
---|
| 1159 | def get_default_schema_name(self, connection): |
---|
| 1160 | query = "SELECT user_name() as user_name;" |
---|
| 1161 | user_name = connection.scalar(sql.text(query)) |
---|
| 1162 | if user_name is not None: |
---|
| 1163 | # now, get the default schema |
---|
| 1164 | query = """ |
---|
| 1165 | SELECT default_schema_name FROM |
---|
| 1166 | sys.database_principals |
---|
| 1167 | WHERE name = :user_name |
---|
| 1168 | AND type = 'S' |
---|
| 1169 | """ |
---|
| 1170 | try: |
---|
| 1171 | default_schema_name = connection.scalar(sql.text(query), |
---|
| 1172 | user_name=user_name) |
---|
| 1173 | if default_schema_name is not None: |
---|
| 1174 | return default_schema_name |
---|
| 1175 | except: |
---|
| 1176 | pass |
---|
| 1177 | return self.schema_name |
---|
| 1178 | |
---|
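The fallback logic of `get_default_schema_name` above can be summarized as follows; the `lookup` callable stands in for the `sys.database_principals` query, and the function is a sketch rather than the dialect's code:

```python
# Resolve the default schema for a user, falling back to the
# dialect-level default ('dbo') when the user is unknown, the lookup
# fails, or no row is found.
def default_schema(user_name, lookup, fallback="dbo"):
    if user_name is None:
        return fallback
    try:
        found = lookup(user_name)
    except Exception:
        return fallback
    return found if found is not None else fallback
```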
| 1179 | def table_names(self, connection, schema): |
---|
| 1180 | s = select([tables.c.table_name], tables.c.table_schema==schema) |
---|
| 1181 | return [row[0] for row in connection.execute(s)] |
---|
| 1182 | |
---|
| 1183 | |
---|
| 1184 | def has_table(self, connection, tablename, schema=None): |
---|
| 1185 | |
---|
| 1186 | current_schema = schema or self.get_default_schema_name(connection) |
---|
| 1187 | s = sql.select([columns], |
---|
| 1188 | current_schema |
---|
| 1189 | and sql.and_(columns.c.table_name==tablename, columns.c.table_schema==current_schema) |
---|
| 1190 | or columns.c.table_name==tablename, |
---|
| 1191 | ) |
---|
| 1192 | |
---|
| 1193 | c = connection.execute(s) |
---|
| 1194 | row = c.fetchone() |
---|
| 1195 | return row is not None |
---|
| 1196 | |
---|
| 1197 | def reflecttable(self, connection, table, include_columns): |
---|
| 1198 | # Get base columns |
---|
| 1199 | if table.schema is not None: |
---|
| 1200 | current_schema = table.schema |
---|
| 1201 | else: |
---|
| 1202 | current_schema = self.get_default_schema_name(connection) |
---|
| 1203 | |
---|
| 1204 | s = sql.select([columns], |
---|
| 1205 | current_schema |
---|
| 1206 | and sql.and_(columns.c.table_name==table.name, columns.c.table_schema==current_schema) |
---|
| 1207 | or columns.c.table_name==table.name, |
---|
| 1208 | order_by=[columns.c.ordinal_position]) |
---|
| 1209 | |
---|
| 1210 | c = connection.execute(s) |
---|
| 1211 | found_table = False |
---|
| 1212 | while True: |
---|
| 1213 | row = c.fetchone() |
---|
| 1214 | if row is None: |
---|
| 1215 | break |
---|
| 1216 | found_table = True |
---|
| 1217 | (name, type, nullable, charlen, numericprec, numericscale, default, collation) = ( |
---|
| 1218 | row[columns.c.column_name], |
---|
| 1219 | row[columns.c.data_type], |
---|
| 1220 | row[columns.c.is_nullable] == 'YES', |
---|
| 1221 | row[columns.c.character_maximum_length], |
---|
| 1222 | row[columns.c.numeric_precision], |
---|
| 1223 | row[columns.c.numeric_scale], |
---|
| 1224 | row[columns.c.column_default], |
---|
| 1225 | row[columns.c.collation_name] |
---|
| 1226 | ) |
---|
| 1227 | if include_columns and name not in include_columns: |
---|
| 1228 | continue |
---|
| 1229 | |
---|
| 1230 | coltype = self.ischema_names.get(type, None) |
---|
| 1231 | |
---|
| 1232 | kwargs = {} |
---|
| 1233 | if coltype in (MSString, MSChar, MSNVarchar, MSNChar, MSText, MSNText, MSBinary, MSVarBinary, sqltypes.Binary): |
---|
| 1234 | kwargs['length'] = charlen |
---|
| 1235 | if collation: |
---|
| 1236 | kwargs['collation'] = collation |
---|
| 1237 | if coltype == MSText or (coltype in (MSString, MSNVarchar) and charlen == -1): |
---|
| 1238 | kwargs.pop('length') |
---|
| 1239 | |
---|
| 1240 | if issubclass(coltype, sqltypes.Numeric): |
---|
| 1241 | kwargs['scale'] = numericscale |
---|
| 1242 | kwargs['precision'] = numericprec |
---|
| 1243 | |
---|
| 1244 | if coltype is None: |
---|
| 1245 | util.warn("Did not recognize type '%s' of column '%s'" % (type, name)) |
---|
| 1246 | coltype = sqltypes.NULLTYPE |
---|
| 1247 | |
---|
| 1248 | coltype = coltype(**kwargs) |
---|
| 1249 | colargs = [] |
---|
| 1250 | if default is not None: |
---|
| 1251 | colargs.append(schema.DefaultClause(sql.text(default))) |
---|
| 1252 | table.append_column(schema.Column(name, coltype, nullable=nullable, autoincrement=False, *colargs)) |
---|
| 1253 | |
---|
| 1254 | if not found_table: |
---|
| 1255 | raise exc.NoSuchTableError(table.name) |
---|
| 1256 | |
---|
| 1257 | # We also run an sp_columns to check for identity columns: |
---|
| 1258 | cursor = connection.execute("sp_columns @table_name = '%s', @table_owner = '%s'" % (table.name, current_schema)) |
---|
| 1259 | ic = None |
---|
| 1260 | while True: |
---|
| 1261 | row = cursor.fetchone() |
---|
| 1262 | if row is None: |
---|
| 1263 | break |
---|
| 1264 | col_name, type_name = row[3], row[5] |
---|
| 1265 | if type_name.endswith("identity") and col_name in table.c: |
---|
| 1266 | ic = table.c[col_name] |
---|
| 1267 | ic.autoincrement = True |
---|
| 1268 | # set up a pseudo-sequence to represent the identity attribute; we interpret this at table.create() time as the identity attribute |
---|
| 1269 | ic.sequence = schema.Sequence(ic.name + '_identity', 1, 1) |
---|
| 1270 | # MSSQL: only one identity per table allowed |
---|
| 1271 | cursor.close() |
---|
| 1272 | break |
---|
| 1273 | if ic is not None: |
---|
| 1274 | try: |
---|
| 1275 | cursor = connection.execute("select ident_seed(?), ident_incr(?)", table.fullname, table.fullname) |
---|
| 1276 | row = cursor.fetchone() |
---|
| 1277 | cursor.close() |
---|
| 1278 | if row is not None: |
---|
| 1279 | ic.sequence.start = int(row[0]) |
---|
| 1280 | ic.sequence.increment = int(row[1]) |
---|
| 1281 | except: |
---|
| 1282 | # ignoring it, works just like before |
---|
| 1283 | pass |
---|
| 1284 | |
---|
| 1285 | # Add constraints |
---|
| 1286 | RR = ref_constraints |
---|
| 1287 | TC = constraints |
---|
| 1288 | C = key_constraints.alias('C') #information_schema.constraint_column_usage: the constrained column |
---|
| 1289 | R = key_constraints.alias('R') #information_schema.constraint_column_usage: the referenced column |
---|
| 1290 | |
---|
| 1291 | # Primary key constraints |
---|
| 1292 | s = sql.select([C.c.column_name, TC.c.constraint_type], sql.and_(TC.c.constraint_name == C.c.constraint_name, |
---|
| 1293 | C.c.table_name == table.name, |
---|
| 1294 | C.c.table_schema == (table.schema or current_schema))) |
---|
| 1295 | c = connection.execute(s) |
---|
| 1296 | for row in c: |
---|
| 1297 | if 'PRIMARY' in row[TC.c.constraint_type.name] and row[0] in table.c: |
---|
| 1298 | table.primary_key.add(table.c[row[0]]) |
---|
| 1299 | |
---|
| 1300 | # Foreign key constraints |
---|
| 1301 | s = sql.select([C.c.column_name, |
---|
| 1302 | R.c.table_schema, R.c.table_name, R.c.column_name, |
---|
| 1303 | RR.c.constraint_name, RR.c.match_option, RR.c.update_rule, RR.c.delete_rule], |
---|
| 1304 | sql.and_(C.c.table_name == table.name, |
---|
| 1305 | C.c.table_schema == (table.schema or current_schema), |
---|
| 1306 | C.c.constraint_name == RR.c.constraint_name, |
---|
| 1307 | R.c.constraint_name == RR.c.unique_constraint_name, |
---|
| 1308 | C.c.ordinal_position == R.c.ordinal_position |
---|
| 1309 | ), |
---|
| 1310 | order_by = [RR.c.constraint_name, R.c.ordinal_position]) |
---|
| 1311 | rows = connection.execute(s).fetchall() |
---|
| 1312 | |
---|
| 1313 | def _gen_fkref(table, rschema, rtbl, rcol): |
---|
| 1314 | if rschema == current_schema and not table.schema: |
---|
| 1315 | return '.'.join([rtbl, rcol]) |
---|
| 1316 | else: |
---|
| 1317 | return '.'.join([rschema, rtbl, rcol]) |
---|
| 1318 | |
---|
| 1319 | # group rows by constraint ID, to handle multi-column FKs |
---|
| 1320 | fknm, scols, rcols = (None, [], []) |
---|
| 1321 | for r in rows: |
---|
| 1322 | scol, rschema, rtbl, rcol, rfknm, fkmatch, fkuprule, fkdelrule = r |
---|
| 1323 | # if the reflected schema is the default schema then don't set it because this will |
---|
| 1324 | # play into the metadata key causing duplicates. |
---|
| 1325 | if rschema == current_schema and not table.schema: |
---|
| 1326 | schema.Table(rtbl, table.metadata, autoload=True, autoload_with=connection) |
---|
| 1327 | else: |
---|
| 1328 | schema.Table(rtbl, table.metadata, schema=rschema, autoload=True, autoload_with=connection) |
---|
| 1329 | if rfknm != fknm: |
---|
| 1330 | if fknm: |
---|
| 1331 | table.append_constraint(schema.ForeignKeyConstraint(scols, [_gen_fkref(table, s, t, c) for s, t, c in rcols], fknm, link_to_name=True)) |
---|
| 1332 | fknm, scols, rcols = (rfknm, [], []) |
---|
| 1333 | if scol not in scols: |
---|
| 1334 | scols.append(scol) |
---|
| 1335 | if (rschema, rtbl, rcol) not in rcols: |
---|
| 1336 | rcols.append((rschema, rtbl, rcol)) |
---|
| 1337 | |
---|
| 1338 | if fknm and scols: |
---|
| 1339 | table.append_constraint(schema.ForeignKeyConstraint(scols, [_gen_fkref(table, s, t, c) for s, t, c in rcols], fknm, link_to_name=True)) |
---|
| 1340 | |
---|
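The loop above folds the ordered constraint rows into one ForeignKeyConstraint per constraint name, collecting the columns of composite keys. The grouping step in isolation (row shapes simplified to `(name, constrained_col, referenced_col)` tuples):

```python
# Rows arrive ordered by constraint name, so each constraint's columns
# are consecutive; collect them into per-constraint column lists.
def group_fk_rows(rows):
    fks = {}
    for name, scol, rcol in rows:
        scols, rcols = fks.setdefault(name, ([], []))
        if scol not in scols:
            scols.append(scol)
        if rcol not in rcols:
            rcols.append(rcol)
    return fks
```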
| 1341 | |
---|
| 1342 | class MSSQLDialect_pymssql(MSSQLDialect): |
---|
| 1343 | supports_sane_rowcount = False |
---|
| 1344 | max_identifier_length = 30 |
---|
| 1345 | |
---|
| 1346 | @classmethod |
---|
| 1347 | def import_dbapi(cls): |
---|
| 1348 | import pymssql as module |
---|
| 1349 | # pymssql doesn't have a Binary method; we use str |
---|
| 1350 | # TODO: monkeypatching here is less than ideal |
---|
| 1351 | module.Binary = lambda st: str(st) |
---|
| 1352 | try: |
---|
| 1353 | module.version_info = tuple(map(int, module.__version__.split('.'))) |
---|
| 1354 | except: |
---|
| 1355 | module.version_info = (0, 0, 0) |
---|
| 1356 | return module |
---|
| 1357 | |
---|
| 1358 | def __init__(self, **params): |
---|
| 1359 | super(MSSQLDialect_pymssql, self).__init__(**params) |
---|
| 1360 | self.use_scope_identity = True |
---|
| 1361 | |
---|
| 1362 | # pymssql understands only ascii |
---|
| 1363 | if self.convert_unicode: |
---|
| 1364 | util.warn("pymssql does not support unicode") |
---|
| 1365 | self.encoding = params.get('encoding', 'ascii') |
---|
| 1366 | |
---|
| 1367 | self.colspecs = MSSQLDialect.colspecs.copy() |
---|
| 1368 | self.ischema_names = MSSQLDialect.ischema_names.copy() |
---|
| 1369 | self.ischema_names['date'] = MSDateTimeAsDate |
---|
| 1370 | self.colspecs[sqltypes.Date] = MSDateTimeAsDate |
---|
| 1371 | self.ischema_names['time'] = MSDateTimeAsTime |
---|
| 1372 | self.colspecs[sqltypes.Time] = MSDateTimeAsTime |
---|
| 1373 | |
---|
| 1374 | def create_connect_args(self, url): |
---|
| 1375 | r = super(MSSQLDialect_pymssql, self).create_connect_args(url) |
---|
| 1376 | if hasattr(self, 'query_timeout'): |
---|
| 1377 | if self.dbapi.version_info > (0, 8, 0): |
---|
| 1378 | r[1]['timeout'] = self.query_timeout |
---|
| 1379 | else: |
---|
| 1380 | self.dbapi._mssql.set_query_timeout(self.query_timeout) |
---|
| 1381 | return r |
---|
| 1382 | |
---|
| 1383 | def make_connect_string(self, keys, query): |
---|
| 1384 | if keys.get('port'): |
---|
| 1385 | # pymssql expects port as host:port, not a separate arg |
---|
| 1386 | keys['host'] = ''.join([keys.get('host', ''), ':', str(keys['port'])]) |
---|
| 1387 | del keys['port'] |
---|
| 1388 | return [[], keys] |
---|
| 1389 | |
---|
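`make_connect_string` above folds the port into the host string, since pymssql of this era takes `host:port` rather than a separate port argument. The transformation in isolation (a sketch over a plain dict):

```python
# Merge a separate 'port' entry into the 'host' entry as host:port,
# mirroring what MSSQLDialect_pymssql.make_connect_string does.
def merge_port(keys):
    if keys.get('port'):
        keys['host'] = '%s:%s' % (keys.get('host', ''), keys['port'])
        del keys['port']
    return keys
```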
| 1390 | def is_disconnect(self, e): |
---|
| 1391 | return isinstance(e, self.dbapi.DatabaseError) and "Error 10054" in str(e) |
---|
| 1392 | |
---|
| 1393 | def do_begin(self, connection): |
---|
| 1394 | pass |
---|
| 1395 | |
---|
| 1396 | |
---|
| 1397 | class MSSQLDialect_pyodbc(MSSQLDialect): |
---|
| 1398 | supports_sane_rowcount = False |
---|
| 1399 | supports_sane_multi_rowcount = False |
---|
| 1400 | # PyODBC unicode is broken on UCS-4 builds |
---|
| 1401 | supports_unicode = sys.maxunicode == 65535 |
---|
| 1402 | supports_unicode_statements = supports_unicode |
---|
| 1403 | execution_ctx_cls = MSSQLExecutionContext_pyodbc |
---|
| 1404 | |
---|
| 1405 | def __init__(self, description_encoding='latin-1', **params): |
---|
| 1406 | super(MSSQLDialect_pyodbc, self).__init__(**params) |
---|
| 1407 | self.description_encoding = description_encoding |
---|
| 1408 | |
---|
        if self.server_version_info < (10,):
            self.colspecs = MSSQLDialect.colspecs.copy()
            self.ischema_names = MSSQLDialect.ischema_names.copy()
            self.ischema_names['date'] = MSDateTimeAsDate
            self.colspecs[sqltypes.Date] = MSDateTimeAsDate
            self.ischema_names['time'] = MSDateTimeAsTime
            self.colspecs[sqltypes.Time] = MSDateTimeAsTime

        # FIXME: this scope_identity sniff should look at the server
        # version, not the ODBC driver; whether use_scope_identity will
        # work depends on the version of pyodbc
        try:
            import pyodbc
            self.use_scope_identity = hasattr(pyodbc.Cursor, 'nextset')
        except ImportError:
            pass

    @classmethod
    def import_dbapi(cls):
        import pyodbc as module
        return module

    def make_connect_string(self, keys, query):
        if 'max_identifier_length' in keys:
            self.max_identifier_length = int(keys.pop('max_identifier_length'))

        if 'odbc_connect' in keys:
            connectors = [urllib.unquote_plus(keys.pop('odbc_connect'))]
        else:
            dsn_connection = 'dsn' in keys or ('host' in keys and 'database' not in keys)
            if dsn_connection:
                connectors = ['dsn=%s' % (keys.pop('host', '') or keys.pop('dsn', ''))]
            else:
                port = ''
                if 'port' in keys and 'port' not in query:
                    port = ',%d' % int(keys.pop('port'))

                connectors = ["DRIVER={%s}" % keys.pop('driver', 'SQL Server'),
                              'Server=%s%s' % (keys.pop('host', ''), port),
                              'Database=%s' % keys.pop('database', '')]

            user = keys.pop("user", None)
            if user:
                connectors.append("UID=%s" % user)
                connectors.append("PWD=%s" % keys.pop('password', ''))
            else:
                connectors.append("Trusted_Connection=Yes")

            # if set to 'Yes', the ODBC layer will try to automagically
            # convert textual data from your database encoding to your
            # client encoding.  This should obviously be set to 'No' if
            # you query a cp1253-encoded database from a latin1 client.
            if 'odbc_autotranslate' in keys:
                connectors.append("AutoTranslate=%s" % keys.pop("odbc_autotranslate"))

            connectors.extend(['%s=%s' % (k, v) for k, v in keys.iteritems()])

        return [[";".join(connectors)], {}]
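    # Illustrative examples (not executed): a URL such as
    #     mssql://user:pw@myhost:1433/mydb
    # produces a connection string along the lines of
    #     DRIVER={SQL Server};Server=myhost,1433;Database=mydb;UID=user;PWD=pw
    # while a DSN-style URL such as mssql://mydsn produces
    #     dsn=mydsn;Trusted_Connection=Yes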

    def is_disconnect(self, e):
        if isinstance(e, self.dbapi.ProgrammingError):
            return "The cursor's connection has been closed." in str(e) or \
                'Attempt to use a closed connection.' in str(e)
        elif isinstance(e, self.dbapi.Error):
            return '[08S01]' in str(e)
        else:
            return False

    def _server_version_info(self, dbapi_con):
        """Convert a pyodbc SQL_DBMS_VER string into a tuple."""
        version = []
        r = re.compile(r'[.\-]')
        for n in r.split(dbapi_con.getinfo(self.dbapi.SQL_DBMS_VER)):
            try:
                version.append(int(n))
            except ValueError:
                version.append(n)
        return tuple(version)


class MSSQLDialect_adodbapi(MSSQLDialect):
    supports_sane_rowcount = True
    supports_sane_multi_rowcount = True
    supports_unicode = sys.maxunicode == 65535
    supports_unicode_statements = True

    colspecs = MSSQLDialect.colspecs.copy()
    colspecs[sqltypes.DateTime] = MSDateTime_adodbapi

    ischema_names = MSSQLDialect.ischema_names.copy()
    ischema_names['datetime'] = MSDateTime_adodbapi

    @classmethod
    def import_dbapi(cls):
        import adodbapi as module
        return module

    def make_connect_string(self, keys, query):
        connectors = ["Provider=SQLOLEDB"]
        if 'port' in keys:
            connectors.append("Data Source=%s, %s" % (keys.get("host"), keys.get("port")))
        else:
            connectors.append("Data Source=%s" % keys.get("host"))
        connectors.append("Initial Catalog=%s" % keys.get("database"))
        user = keys.get("user")
        if user:
            connectors.append("User Id=%s" % user)
            connectors.append("Password=%s" % keys.get("password", ""))
        else:
            connectors.append("Integrated Security=SSPI")
        return [[";".join(connectors)], {}]

    def is_disconnect(self, e):
        return isinstance(e, self.dbapi.adodbapi.DatabaseError) and \
            "'connection failure'" in str(e)


dialect_mapping = {
    'pymssql': MSSQLDialect_pymssql,
    'pyodbc': MSSQLDialect_pyodbc,
    'adodbapi': MSSQLDialect_adodbapi
}


class MSSQLCompiler(compiler.DefaultCompiler):
    operators = compiler.OPERATORS.copy()
    operators.update({
        sql_operators.concat_op: '+',
        sql_operators.match_op: lambda x, y: "CONTAINS (%s, %s)" % (x, y)
    })

    functions = compiler.DefaultCompiler.functions.copy()
    functions.update({
        sql_functions.now: 'CURRENT_TIMESTAMP',
        sql_functions.current_date: 'GETDATE()',
        'length': lambda x: "LEN(%s)" % x,
        sql_functions.char_length: lambda x: "LEN(%s)" % x
    })

    extract_map = compiler.DefaultCompiler.extract_map.copy()
    extract_map.update({
        'doy': 'dayofyear',
        'dow': 'weekday',
        'milliseconds': 'millisecond',
        'microseconds': 'microsecond'
    })

    def __init__(self, *args, **kwargs):
        super(MSSQLCompiler, self).__init__(*args, **kwargs)
        self.tablealiases = {}

    def get_select_precolumns(self, select):
        """MS-SQL puts TOP, its version of LIMIT, here."""
        if select._distinct or select._limit:
            s = select._distinct and "DISTINCT " or ""

            if select._limit:
                if not select._offset:
                    s += "TOP %s " % (select._limit,)
                else:
                    if not self.dialect.has_window_funcs:
                        raise exc.InvalidRequestError('MSSQL does not support LIMIT with an offset')
            return s
        return compiler.DefaultCompiler.get_select_precolumns(self, select)

    def limit_clause(self, select):
        # LIMIT in MSSQL is rendered as TOP, after the SELECT keyword
        return ""

    def visit_select(self, select, **kwargs):
        """Look for ``LIMIT`` and ``OFFSET`` in a select statement, and if
        found, try to wrap it in a subquery with a ``ROW_NUMBER()`` criterion.

        """
        if self.dialect.has_window_funcs and not getattr(select, '_mssql_visit', None) and select._offset:
            # to use ROW_NUMBER(), an ORDER BY is required.
            orderby = self.process(select._order_by_clause)
            if not orderby:
                raise exc.InvalidRequestError('MSSQL requires an order_by when using an offset.')

            _offset = select._offset
            _limit = select._limit
            select._mssql_visit = True
            select = select.column(sql.literal_column(
                "ROW_NUMBER() OVER (ORDER BY %s)" % orderby
                ).label("mssql_rn")).order_by(None).alias()

            limitselect = sql.select([c for c in select.c if c.key != 'mssql_rn'])
            limitselect.append_whereclause("mssql_rn > %d" % _offset)
            if _limit is not None:
                limitselect.append_whereclause("mssql_rn <= %d" % (_limit + _offset))
            return self.process(limitselect, iswrapper=True, **kwargs)
        else:
            return compiler.DefaultCompiler.visit_select(self, select, **kwargs)
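    # Illustrative example (approximate SQL, not executed): a select with
    # limit=5, offset=10 and an ORDER BY on t.id compiles to roughly
    #     SELECT col1, col2 FROM
    #         (SELECT t.col1 AS col1, t.col2 AS col2,
    #                 ROW_NUMBER() OVER (ORDER BY t.id) AS mssql_rn
    #          FROM t) AS anon_1
    #     WHERE mssql_rn > 10 AND mssql_rn <= 15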
| 1601 | def _schema_aliased_table(self, table): |
---|
| 1602 | if getattr(table, 'schema', None) is not None: |
---|
| 1603 | if table not in self.tablealiases: |
---|
| 1604 | self.tablealiases[table] = table.alias() |
---|
| 1605 | return self.tablealiases[table] |
---|
| 1606 | else: |
---|
| 1607 | return None |
---|
| 1608 | |
---|
| 1609 | def visit_table(self, table, mssql_aliased=False, **kwargs): |
---|
| 1610 | if mssql_aliased: |
---|
| 1611 | return super(MSSQLCompiler, self).visit_table(table, **kwargs) |
---|
| 1612 | |
---|
| 1613 | # alias schema-qualified tables |
---|
| 1614 | alias = self._schema_aliased_table(table) |
---|
| 1615 | if alias is not None: |
---|
| 1616 | return self.process(alias, mssql_aliased=True, **kwargs) |
---|
| 1617 | else: |
---|
| 1618 | return super(MSSQLCompiler, self).visit_table(table, **kwargs) |
---|
| 1619 | |
---|
| 1620 | def visit_alias(self, alias, **kwargs): |
---|
| 1621 | # translate for schema-qualified table aliases |
---|
| 1622 | self.tablealiases[alias.original] = alias |
---|
| 1623 | kwargs['mssql_aliased'] = True |
---|
| 1624 | return super(MSSQLCompiler, self).visit_alias(alias, **kwargs) |
---|
| 1625 | |
---|
    def visit_extract(self, extract):
        field = self.extract_map.get(extract.field, extract.field)
        return 'DATEPART("%s", %s)' % (field, self.process(extract.expr))

    def visit_rollback_to_savepoint(self, savepoint_stmt):
        return "ROLLBACK TRANSACTION %s" % self.preparer.format_savepoint(savepoint_stmt)

    def visit_column(self, column, result_map=None, **kwargs):
        if column.table is not None and \
                (not self.isupdate and not self.isdelete) or self.is_subquery():
            # translate for schema-qualified table aliases
            t = self._schema_aliased_table(column.table)
            if t is not None:
                converted = expression._corresponding_column_or_error(t, column)

                if result_map is not None:
                    result_map[column.name.lower()] = (column.name, (column, ), column.type)

                return super(MSSQLCompiler, self).visit_column(converted, result_map=None, **kwargs)

        return super(MSSQLCompiler, self).visit_column(column, result_map=result_map, **kwargs)

    def visit_binary(self, binary, **kwargs):
        """Move bind parameters to the right-hand side of an operator, where
        possible.

        """
        if isinstance(binary.left, expression._BindParamClause) and binary.operator == operator.eq \
                and not isinstance(binary.right, expression._BindParamClause):
            return self.process(expression._BinaryExpression(binary.right, binary.left, binary.operator), **kwargs)
        else:
            if (binary.operator is operator.eq or binary.operator is operator.ne) and (
                    (isinstance(binary.left, expression._FromGrouping) and isinstance(binary.left.element, expression._ScalarSelect)) or
                    (isinstance(binary.right, expression._FromGrouping) and isinstance(binary.right.element, expression._ScalarSelect)) or
                    isinstance(binary.left, expression._ScalarSelect) or isinstance(binary.right, expression._ScalarSelect)):
                op = binary.operator == operator.eq and "IN" or "NOT IN"
                return self.process(expression._BinaryExpression(binary.left, binary.right, op), **kwargs)
            return super(MSSQLCompiler, self).visit_binary(binary, **kwargs)

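    # Illustrative example (not executed): a comparison against a scalar
    # select such as
    #     table.c.x == select([foo.c.x])
    # is emitted as "x IN (SELECT foo.x FROM foo)" rather than
    # "x = (SELECT foo.x FROM foo)".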
    def visit_insert(self, insert_stmt):
        insert_select = False
        if insert_stmt.parameters:
            insert_select = [p for p in insert_stmt.parameters.values() if isinstance(p, sql.Select)]
        if insert_select:
            self.isinsert = True
            colparams = self._get_colparams(insert_stmt)
            preparer = self.preparer

            insert = ' '.join(["INSERT"] +
                              [self.process(x) for x in insert_stmt._prefixes])

            if not colparams and not self.dialect.supports_default_values and not self.dialect.supports_empty_insert:
                raise exc.CompileError(
                    "The version of %s you are using does not support empty inserts." % self.dialect.name)
            elif not colparams and self.dialect.supports_default_values:
                return (insert + " INTO %s DEFAULT VALUES" % (
                    preparer.format_table(insert_stmt.table),))
            else:
                return (insert + " INTO %s (%s) SELECT %s" %
                        (preparer.format_table(insert_stmt.table),
                         ', '.join([preparer.format_column(c[0])
                                    for c in colparams]),
                         ', '.join([c[1] for c in colparams])))
        else:
            return super(MSSQLCompiler, self).visit_insert(insert_stmt)

    def label_select_column(self, select, column, asfrom):
        if isinstance(column, expression.Function):
            return column.label(None)
        else:
            return super(MSSQLCompiler, self).label_select_column(select, column, asfrom)

    def for_update_clause(self, select):
        # "FOR UPDATE" is only allowed on "DECLARE CURSOR", which
        # SQLAlchemy doesn't use
        return ''

    def order_by_clause(self, select):
        order_by = self.process(select._order_by_clause)

        # MSSQL only allows ORDER BY in subqueries if there is a LIMIT
        if order_by and (not self.is_subquery() or select._limit):
            return " ORDER BY " + order_by
        else:
            return ""


class MSSQLSchemaGenerator(compiler.SchemaGenerator):
    def get_column_specification(self, column, **kwargs):
        colspec = self.preparer.format_column(column) + " " + \
            column.type.dialect_impl(self.dialect).get_col_spec()

        if column.nullable is not None:
            if not column.nullable or column.primary_key:
                colspec += " NOT NULL"
            else:
                colspec += " NULL"

        if not column.table:
            raise exc.InvalidRequestError("mssql requires Table-bound columns in order to generate DDL")

        seq_col = _table_sequence_column(column.table)

        # install an IDENTITY Sequence if we have an implicit IDENTITY column
        if seq_col is column:
            sequence = getattr(column, 'sequence', None)
            if sequence:
                start, increment = sequence.start or 1, sequence.increment or 1
            else:
                start, increment = 1, 1
            colspec += " IDENTITY(%s,%s)" % (start, increment)
        else:
            default = self.get_column_default_string(column)
            if default is not None:
                colspec += " DEFAULT " + default

        return colspec
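    # Illustrative example (not executed): an implicit IDENTITY column
    #     Column('id', Integer, primary_key=True)
    # yields a column specification along the lines of
    #     [id] INTEGER NOT NULL IDENTITY(1,1)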

class MSSQLSchemaDropper(compiler.SchemaDropper):
    def visit_index(self, index):
        self.append("\nDROP INDEX %s.%s" % (
            self.preparer.quote_identifier(index.table.name),
            self.preparer.quote(self._validate_identifier(index.name, False), index.quote)
            ))
        self.execute()

class MSSQLIdentifierPreparer(compiler.IdentifierPreparer):
    reserved_words = RESERVED_WORDS

    def __init__(self, dialect):
        super(MSSQLIdentifierPreparer, self).__init__(dialect, initial_quote='[', final_quote=']')

    def _escape_identifier(self, value):
        # TODO: determine MSSQL's escaping rules
        return value

    def quote_schema(self, schema, force=True):
        """Prepare a quoted table and schema name."""
        result = '.'.join([self.quote(x, force) for x in schema.split('.')])
        return result


dialect = MSSQLDialect
dialect.statement_compiler = MSSQLCompiler
dialect.schemagenerator = MSSQLSchemaGenerator
dialect.schemadropper = MSSQLSchemaDropper
dialect.preparer = MSSQLIdentifierPreparer