[egenix-users] Max length of (N)VARCHAR; min length of TEXT

Charles Bearden Charles.F.Bearden at uth.tmc.edu
Thu Aug 19 12:18:21 CEST 2004


> -----Original Message-----
> From: egenix-users-bounces at lists.egenix.com [mailto:egenix-users-
> bounces at lists.egenix.com] On Behalf Of Charles Bearden
> Sent: Wednesday, August 18, 2004 3:38 PM
> 
> > -----Original Message-----
> > From: M.-A. Lemburg [mailto:mal at egenix.com]
> > Sent: Wednesday, August 18, 2004 7:13 AM
> >
> > Charles Bearden wrote:
> > > Egenix-mx-commercial-2.0.6, libiodbc-3.51.2, and freetds-0.62.4 on RH 9,
> > > and egenix-mx-commercial-2.0.6 with Windows ODBC driver on WinXP, both
> > > talking to MS SQL Server 2000.
> > >
> > > When I try to insert strings longer than 255 characters into a MS SQL
> > > column defined as nvarchar(1024) through FreeTDS/iODBC from Linux, I get
> > > the following exception:
> > >
> > > Traceback (most recent call last):
> > >   File "./convtest.py", line 32, in ?
> > >     try: mx_cur.execute(ins, t)
> > > mxODBC.OperationalError: ('', 260, "[FreeTDS][SQL Server]Disallowed
> > > implicit conversion from data type text to data type nvarchar, table
> > > 'IfSH_Test.dbo.pmCite',  column 'ArticleTitle'. Use the CONVERT function
> > > to run this query.", 4579)
> >
> > The problem here is that FreeTDS tries to convert the Python
> > 8-bit string input into a UTF-16 string. Obviously SQL Server
> > doesn't want to do this conversion implicitly (probably because
> > it doesn't know the encoding of the input string).
> >
> > Have you tried passing in the data as Unicode string ?
> 
> Thanks! That's extremely helpful to know.  I don't think I have tried
> passing in a Unicode string, or if I have there was some other place in
> the program where it caused a UnicodeDecodeError or something.  I may
> retool the script to implement your suggestion.

When I try to use Unicode strings as arguments to an SQL statement that
uses placeholders and is itself a (non-Unicode) byte string, I get a
UnicodeEncodeError like

"UnicodeEncodeError: 'ascii' codec can't encode character u'\xf6' in
position 128: ordinal not in range(128)".  

I believe that this occurs because some part of the back end (mx.ODBC?
FreeTDS? iODBC?) is trying to encode the Unicode arguments in the same
encoding as the SQL statement appears to be in.  When I then try to use
an SQL statement that is a Python Unicode string with arguments that are
Unicode strings, I get a TypeError:

"TypeError: SQL command must be a string".  

So, the setup doesn't appear to like Unicode values with byte-string SQL
statements, and it appears not to like Unicode SQL statements.  At
present, I don't see a way to pass the data in as Python Unicode
strings.  Am I missing something?  
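
For reference, a minimal sketch of the two attempts described above
(purely illustrative; the connection string and the table/column names
are stand-ins, not what the real script uses, and the iODBC subpackage
and DriverConnect call are shown as I understand the mxODBC API):

    import mx.ODBC.iODBC as ODBC

    db = ODBC.DriverConnect('DSN=IfSH_Test;UID=user;PWD=secret')
    cur = db.cursor()

    title = u'K\xf6nig'   # Unicode value containing a non-ASCII character

    # Attempt 1: byte-string SQL statement with a Unicode parameter.
    # In this setup it fails with
    #   UnicodeEncodeError: 'ascii' codec can't encode character u'\xf6' ...
    ins = "INSERT INTO pmCite (ArticleTitle) VALUES (?)"
    try:
        cur.execute(ins, (title,))
    except UnicodeError, e:
        print 'byte-string SQL, Unicode arg:', e

    # Attempt 2: making the SQL statement itself Unicode as well.
    # mxODBC 2.0.x rejects this with
    #   TypeError: SQL command must be a string
    try:
        cur.execute(unicode(ins), (title,))
    except TypeError, e:
        print 'Unicode SQL, Unicode arg:', e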

Thanks,
Chuck



> The solution I came up with is a bit more kludgy.  I wrote a function
> that takes the data from the '.description' attribute of the source
> database cursor, after executing a 'select * from <tablename> limit 1'
> to populate it correctly, in order to build an INSERT statement with
> 'CONVERT(?)' instead of simply '?' for those columns that might have
> data longer than 255 bytes:

[ugly hack ellipsis]
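
The elided code isn't reproduced here, but as a purely hypothetical
illustration of the approach just described, such a helper might look
roughly like the following (the function name, the 255-byte threshold,
and the CONVERT target type are guesses, not what the original function
actually used):

    def build_insert(src_cursor, tablename):
        # Run a cheap query so .description is populated correctly.
        src_cursor.execute('select * from %s limit 1' % tablename)
        names = []
        markers = []
        for (name, type_code, display_size, internal_size,
             precision, scale, null_ok) in src_cursor.description:
            names.append(name)
            if internal_size is not None and internal_size > 255:
                # Ask SQL Server for an explicit conversion so it does
                # not refuse the implicit text -> nvarchar one.
                markers.append('CONVERT(nvarchar(%d), ?)' % internal_size)
            else:
                markers.append('?')
        return 'INSERT INTO %s (%s) VALUES (%s)' % (
            tablename, ', '.join(names), ', '.join(markers))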


