.. dialect:: postgresql+psycopg2
    :name: psycopg2
    :dbapi: psycopg2
    :connectstring: postgresql+psycopg2://user:password@host:port/dbname[?key=value&key=value...]
    :url: https://pypi.org/project/psycopg2/

psycopg2 Connect Arguments
--------------------------

Keyword arguments that are specific to the SQLAlchemy psycopg2 dialect
may be passed to :func:`_sa.create_engine()`, and include the following:

* ``isolation_level``: This option, available for all PostgreSQL dialects,
  includes the ``AUTOCOMMIT`` isolation level when using the psycopg2
  dialect.  This option sets the **default** isolation level for the
  connection that is set immediately upon connection to the database before
  the connection is pooled.  This option is generally superseded by the more
  modern :paramref:`_engine.Connection.execution_options.isolation_level`
  execution option, detailed at :ref:`dbapi_autocommit`.

  .. seealso::

    :ref:`psycopg2_isolation_level`

    :ref:`dbapi_autocommit`

* ``client_encoding``: sets the client encoding in a libpq-agnostic way,
  using psycopg2's ``set_client_encoding()`` method.

  .. seealso::

    :ref:`psycopg2_unicode`

* ``use_native_unicode``: Under Python 2 only, this can be set to False to
  disable the use of psycopg2's native Unicode support.

  .. seealso::

    :ref:`psycopg2_disable_native_unicode`

* ``executemany_mode``, ``executemany_batch_page_size``,
  ``executemany_values_page_size``: Allows use of psycopg2 extensions for
  optimizing "executemany"-style queries.  See the referenced section below
  for details.

  .. seealso::

    :ref:`psycopg2_executemany_mode`

.. tip::

    The above keyword arguments are **dialect** keyword arguments, meaning
    that they are passed as explicit keyword arguments to
    :func:`_sa.create_engine()`::

        engine = create_engine(
            "postgresql+psycopg2://scott:tiger@localhost/test",
            isolation_level="SERIALIZABLE",
        )

    These should not be confused with **DBAPI** connect arguments, which are
    passed as part of the :paramref:`_sa.create_engine.connect_args`
    dictionary and/or are passed in the URL query string, as detailed in the
    section :ref:`custom_dbapi_args`.

.. _psycopg2_ssl:

SSL Connections
---------------

The psycopg2 module has a connection argument named ``sslmode`` for
controlling its behavior regarding secure (SSL) connections. The default is
``sslmode=prefer``; it will attempt an SSL connection and if that fails it
will fall back to an unencrypted connection.  ``sslmode=require`` may be used
to ensure that only secure connections are established.  Consult the
psycopg2 / libpq documentation for further options that are available.

Note that ``sslmode`` is specific to psycopg2 so it is included in the
connection URI::

    engine = sa.create_engine(
        "postgresql+psycopg2://scott:tiger@192.168.0.199:5432/test?sslmode=require"
    )

Unix Domain Connections
------------------------

psycopg2 supports connecting via Unix domain connections.  When the ``host``
portion of the URL is omitted, SQLAlchemy passes ``None`` to psycopg2, which
specifies Unix-domain communication rather than TCP/IP communication::

    create_engine("postgresql+psycopg2://user:password@/dbname")

By default, the socket file used is the Unix-domain socket in ``/tmp``, or
whatever socket directory was specified when PostgreSQL was built.
This value can be overridden by passing a pathname to psycopg2, using
``host`` as an additional keyword argument::

    create_engine("postgresql+psycopg2://user:password@/dbname?host=/var/lib/postgresql")

.. seealso::

    `PQconnectdbParams \ `_

.. _psycopg2_multi_host:

Specifying multiple fallback hosts
-----------------------------------

psycopg2 supports multiple connection points in the connection string.
When the ``host`` parameter is used multiple times in the query section of
the URL, SQLAlchemy will create a single string of the host and port
information provided to make the connections::

    create_engine(
        "postgresql+psycopg2://user:password@/dbname?host=HostA:port1&host=HostB&host=HostC"
    )

A connection to each host is then attempted until either a connection is
successful or all connections are unsuccessful, in which case an error is
raised.

.. versionadded:: 1.3.20 Support for multiple hosts in PostgreSQL connection
   string.

.. seealso::

    `PQConnString \ `_

Empty DSN Connections / Environment Variable Connections
---------------------------------------------------------

The psycopg2 DBAPI can connect to PostgreSQL by passing an empty DSN to the
libpq client library, which by default indicates to connect to a localhost
PostgreSQL database that is open for "trust" connections.  This behavior can
be further tailored using a particular set of environment variables which are
prefixed with ``PG_...``, which are consumed by ``libpq`` to take the place
of any or all elements of the connection string.

For this form, the URL can be passed without any elements other than the
initial scheme::

    engine = create_engine('postgresql+psycopg2://')

In the above form, a blank "dsn" string is passed to the
``psycopg2.connect()`` function which in turn represents an empty DSN passed
to libpq.

.. versionadded:: 1.3.2 support for parameter-less connections with psycopg2.

.. seealso::

    `Environment Variables \ `_ -
    PostgreSQL documentation on how to use ``PG_...``
    environment variables for connections.

.. _psycopg2_execution_options:

Per-Statement/Connection Execution Options
-------------------------------------------

The following DBAPI-specific options are respected when used with
:meth:`_engine.Connection.execution_options`,
:meth:`.Executable.execution_options`,
:meth:`_query.Query.execution_options`, in addition to those not specific to
DBAPIs:

* ``isolation_level`` - Set the transaction isolation level for the lifespan
  of a :class:`_engine.Connection` (can only be set on a connection, not a
  statement or query).  See :ref:`psycopg2_isolation_level`.

* ``stream_results`` - Enable or disable usage of psycopg2 server side
  cursors - this feature makes use of "named" cursors in combination with
  special result handling methods so that result rows are not fully buffered.
  Defaults to False, meaning cursors are buffered by default.

* ``max_row_buffer`` - when using ``stream_results``, an integer value that
  specifies the maximum number of rows to buffer at a time.  This is
  interpreted by the :class:`.BufferedRowCursorResult`, and if omitted the
  buffer will grow to ultimately store 1000 rows at a time.

  .. versionchanged:: 1.4 The ``max_row_buffer`` size can now be greater than
     1000, and the buffer will grow to that size.
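For illustration, the following is a minimal sketch of applying these
execution options; the connection URL, the ``VACUUM`` statement and the table
name are hypothetical placeholders rather than part of any fixed API::

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://scott:tiger@localhost/test")

    # AUTOCOMMIT isolation for the lifespan of this connection only
    with engine.connect().execution_options(isolation_level="AUTOCOMMIT") as conn:
        conn.execute(text("VACUUM"))

    # server side (named) cursor with a bounded row buffer
    with engine.connect() as conn:
        result = conn.execution_options(
            stream_results=True, max_row_buffer=500
        ).execute(text("SELECT * FROM some_large_table"))
        for row in result:
            ...  # rows are fetched in chunks rather than fully buffered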
.. _psycopg2_batch_mode:

.. _psycopg2_executemany_mode:

Psycopg2 Fast Execution Helpers
-------------------------------

Modern versions of psycopg2 include a feature known as
`Fast Execution Helpers \ `_, which have been shown in benchmarking to
improve psycopg2's executemany() performance, primarily with INSERT
statements, by multiple orders of magnitude.  SQLAlchemy internally makes use
of these extensions for ``executemany()`` style calls, which correspond to
lists of parameters being passed to :meth:`_engine.Connection.execute` as
detailed in :ref:`multiple parameter sets <execute_multiple>`.  The ORM also
uses this mode internally whenever possible.

The two available extensions on the psycopg2 side are the
``execute_values()`` and ``execute_batch()`` functions.  The psycopg2 dialect
defaults to using the ``execute_values()`` extension for all qualifying
INSERT statements.

.. versionchanged:: 1.4 The psycopg2 dialect now defaults to a new mode
   ``"values_only"`` for ``executemany_mode``, which allows an order of
   magnitude performance improvement for INSERT statements, but does not
   include "batch" mode for UPDATE and DELETE statements which removes the
   ability of ``cursor.rowcount`` to function correctly.

The use of these extensions is controlled by the ``executemany_mode`` flag
which may be passed to :func:`_sa.create_engine`::

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@host/dbname",
        executemany_mode='values_plus_batch')

Possible options for ``executemany_mode`` include:

* ``values_only`` - this is the default value.  The psycopg2
  ``execute_values()`` extension is used for qualifying INSERT statements,
  which rewrites the INSERT to include multiple VALUES clauses so that many
  parameter sets can be inserted with one statement.

  .. versionadded:: 1.4 Added ``"values_only"`` setting for
     ``executemany_mode`` which is also now the default.

* ``None`` - No psycopg2 extensions are used, and the usual
  ``cursor.executemany()`` method is used when invoking statements with
  multiple parameter sets.

* ``'batch'`` - Uses ``psycopg2.extras.execute_batch`` for all qualifying
  INSERT, UPDATE and DELETE statements, so that multiple copies of a SQL
  query, each one corresponding to a parameter set passed to
  ``executemany()``, are joined into a single SQL string separated by a
  semicolon.  When using this mode, the
  :attr:`_engine.CursorResult.rowcount` attribute will not contain a value
  for executemany-style executions.

* ``'values_plus_batch'`` - ``execute_values`` is used for qualifying INSERT
  statements, ``execute_batch`` is used for UPDATE and DELETE.  When using
  this mode, the :attr:`_engine.CursorResult.rowcount` attribute will not
  contain a value for executemany-style executions against UPDATE and DELETE
  statements.

By "qualifying statements", we mean that the statement being executed must be
a Core :func:`_expression.insert`, :func:`_expression.update` or
:func:`_expression.delete` construct, and not a plain textual SQL string or
one constructed using :func:`_expression.text`.  When using the ORM, all
insert/update/delete statements used by the ORM flush process are qualifying.

The "page size" for the "values" and "batch" strategies can be affected by
using the ``executemany_batch_page_size`` and
``executemany_values_page_size`` engine parameters.  These control how many
parameter sets should be represented in each execution.  The "values" page
size defaults to 1000, which is different from psycopg2's default.  The
"batch" page size defaults to 100.
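Putting this together, the following rough sketch shows a "qualifying"
executemany-style call; the engine URL and the ``accounts`` table are purely
illustrative, and only the list-of-dictionaries calling form against a Core
construct is significant::

    from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@localhost/test",
        executemany_mode="values_only",   # the default
    )

    metadata = MetaData()
    accounts = Table(
        "accounts", metadata,
        Column("id", Integer, primary_key=True),
        Column("name", String(50)),
    )
    metadata.create_all(engine)

    with engine.begin() as conn:
        # A list of parameter dictionaries triggers the executemany code
        # path; because the statement is a Core insert() construct, the
        # dialect dispatches it through execute_values().
        conn.execute(
            accounts.insert(),
            [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 3, "name": "c"}],
        )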
The page sizes can be adjusted by passing new values to
:func:`_engine.create_engine`::

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@host/dbname",
        executemany_mode='values',
        executemany_values_page_size=10000, executemany_batch_page_size=500)

.. versionchanged:: 1.4

    The default for ``executemany_values_page_size`` is now 1000, up from
    100.

.. seealso::

    :ref:`execute_multiple` - General information on using the
    :class:`_engine.Connection` object to execute statements in such a way
    as to make use of the DBAPI ``.executemany()`` method.

.. _psycopg2_unicode:

Unicode with Psycopg2
----------------------

The psycopg2 DBAPI driver supports Unicode data transparently.  Under Python
2 only, the SQLAlchemy psycopg2 dialect will enable the
``psycopg2.extensions.UNICODE`` extension by default to ensure Unicode is
handled properly; under Python 3, this is psycopg2's default behavior.

The client character encoding can be controlled for the psycopg2 dialect
in the following ways:

* For PostgreSQL 9.1 and above, the ``client_encoding`` parameter may be
  passed in the database URL; this parameter is consumed by the underlying
  ``libpq`` PostgreSQL client library::

    engine = create_engine("postgresql+psycopg2://user:pass@host/dbname?client_encoding=utf8")

  Alternatively, the above ``client_encoding`` value may be passed using
  :paramref:`_sa.create_engine.connect_args` for programmatic establishment
  with ``libpq``::

    engine = create_engine(
        "postgresql+psycopg2://user:pass@host/dbname",
        connect_args={'client_encoding': 'utf8'}
    )

* For all PostgreSQL versions, psycopg2 supports a client-side encoding
  value that will be passed to database connections when they are first
  established.  The SQLAlchemy psycopg2 dialect supports this using the
  ``client_encoding`` parameter passed to :func:`_sa.create_engine`::

    engine = create_engine(
        "postgresql+psycopg2://user:pass@host/dbname",
        client_encoding="utf8"
    )

  .. tip:: The above ``client_encoding`` parameter admittedly is very similar
      in appearance to usage of the parameter within the
      :paramref:`_sa.create_engine.connect_args` dictionary; the difference
      above is that the parameter is consumed by psycopg2 and is passed to
      the database connection using ``SET client_encoding TO 'utf8'``; in the
      previously mentioned style, the parameter is instead passed through
      psycopg2 and consumed by the ``libpq`` library.

* A common way to set up client encoding with PostgreSQL databases is to
  ensure it is configured within the server-side postgresql.conf file; this
  is the recommended way to set encoding for a server that is consistently
  of one encoding in all databases::

    # postgresql.conf file

    # client_encoding = sql_ascii  # actually, defaults to database
                                   # encoding
    client_encoding = utf8

.. _psycopg2_disable_native_unicode:

Disabling Native Unicode
^^^^^^^^^^^^^^^^^^^^^^^^

Under Python 2 only, SQLAlchemy can also be instructed to skip the usage of
the psycopg2 ``UNICODE`` extension and to instead utilize its own unicode
encode/decode services, which are normally reserved only for those DBAPIs
that don't fully support unicode directly.  Passing
``use_native_unicode=False`` to :func:`_sa.create_engine` will disable usage
of ``psycopg2.extensions.UNICODE``.  SQLAlchemy will instead encode data
itself into Python bytestrings on the way in and coerce from bytes on the way
back, using the value of the :func:`_sa.create_engine` ``encoding``
parameter, which defaults to ``utf-8``.
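As a brief sketch (relevant under Python 2 only, and using an illustrative
URL), disabling the extension and relying on SQLAlchemy's own codec might
look like::

    from sqlalchemy import create_engine

    # bypass psycopg2.extensions.UNICODE; SQLAlchemy performs the
    # encode/decode using the given encoding (utf-8 is already the default)
    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@localhost/test",
        use_native_unicode=False,
        encoding="utf-8",
    )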
SQLAlchemy's own unicode encode/decode functionality is steadily becoming
obsolete as most DBAPIs now support unicode fully.

Transactions
------------

The psycopg2 dialect fully supports SAVEPOINT and two-phase commit
operations.

.. _psycopg2_isolation_level:

Psycopg2 Transaction Isolation Level
-------------------------------------

As discussed in :ref:`postgresql_isolation_level`, all PostgreSQL dialects
support setting of transaction isolation level both via the
``isolation_level`` parameter passed to :func:`_sa.create_engine`, as well as
the ``isolation_level`` argument used by
:meth:`_engine.Connection.execution_options`.  When using the psycopg2
dialect, these options make use of psycopg2's ``set_isolation_level()``
connection method, rather than emitting a PostgreSQL directive; this is
because psycopg2's API-level setting is always emitted at the start of each
transaction in any case.

The psycopg2 dialect supports these constants for isolation level:

* ``READ COMMITTED``
* ``READ UNCOMMITTED``
* ``REPEATABLE READ``
* ``SERIALIZABLE``
* ``AUTOCOMMIT``

.. seealso::

    :ref:`postgresql_isolation_level`

    :ref:`pg8000_isolation_level`

NOTICE logging
---------------

The psycopg2 dialect will log PostgreSQL NOTICE messages via the
``sqlalchemy.dialects.postgresql`` logger.  When this logger is set to the
``logging.INFO`` level, notice messages will be logged::

    import logging

    logging.getLogger('sqlalchemy.dialects.postgresql').setLevel(logging.INFO)

Above, it is assumed that logging is configured externally.  If this is not
the case, configuration such as ``logging.basicConfig()`` must be utilized::

    import logging

    logging.basicConfig()   # log messages to stdout
    logging.getLogger('sqlalchemy.dialects.postgresql').setLevel(logging.INFO)

.. seealso::

    `Logging HOWTO `_ - on the python.org website

.. _psycopg2_hstore:

HSTORE type
------------

The ``psycopg2`` DBAPI includes an extension to natively handle marshalling
of the HSTORE type.  The SQLAlchemy psycopg2 dialect will enable this
extension by default when psycopg2 version 2.4 or greater is used, and it is
detected that the target database has the HSTORE type set up for use.  In
other words, when the dialect makes the first connection, a sequence like the
following is performed:

1. Request the available HSTORE oids using
   ``psycopg2.extras.HstoreAdapter.get_oids()``.  If this function returns a
   list of HSTORE identifiers, we then determine that the ``HSTORE``
   extension is present.  This function is **skipped** if the version of
   psycopg2 installed is less than version 2.4.

2. If the ``use_native_hstore`` flag is at its default of ``True``, and we've
   detected that ``HSTORE`` oids are available, the
   ``psycopg2.extensions.register_hstore()`` extension is invoked for all
   connections.

The ``register_hstore()`` extension has the effect of **all Python
dictionaries being accepted as parameters regardless of the type of target
column in SQL**.  The dictionaries are converted by this extension into a
textual HSTORE expression.  If this behavior is not desired, disable the use
of the hstore extension by setting ``use_native_hstore`` to ``False`` as
follows::

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@localhost/test",
        use_native_hstore=False)

The ``HSTORE`` type is **still supported** when the
``psycopg2.extensions.register_hstore()`` extension is not used.
It merely means that the coercion between Python dictionaries and the HSTORE
string format, on both the parameter side and the result side, will take
place within SQLAlchemy's own marshalling logic, and not that of
``psycopg2`` which may be more performant.
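As a short illustration of the type-level behavior, a hypothetical table
using the :class:`.HSTORE` type accepts and returns Python dictionaries
whether or not the native extension is in use; only the layer performing the
dict/string coercion differs.  The URL and table below are assumptions for
the sketch, and the ``hstore`` extension must already be installed in the
target database::

    from sqlalchemy import Column, Integer, MetaData, Table, create_engine, select
    from sqlalchemy.dialects.postgresql import HSTORE

    engine = create_engine(
        "postgresql+psycopg2://scott:tiger@localhost/test",
        use_native_hstore=False,  # coercion handled by SQLAlchemy, not psycopg2
    )

    metadata = MetaData()
    data = Table(
        "data_table", metadata,
        Column("id", Integer, primary_key=True),
        Column("attrs", HSTORE),
    )
    metadata.create_all(engine)

    with engine.begin() as conn:
        conn.execute(data.insert(), {"id": 1, "attrs": {"color": "red", "size": "m"}})
        attrs = conn.execute(select(data.c.attrs)).scalar_one()
        # attrs is a plain Python dict in either configuration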