Postgres "out of memory for query result" error moving bytea

5 years 8 months ago #12646 by gerber
I am consistently receiving the following error when attempting to move data from PostgreSQL 9.0 to SQL Server 2005 with SSIS 2005:
[OLE DB Source [1]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "PostgreSQL Native Provider" Hresult: 0x80040E14 Description: "out of memory for query result ".

The process starts the Postgres source and churns for about 5 minutes before failing with the "out of memory".

I am using a "full" trial version which expires in a few more weeks, "PGNP-Postgres-SE-Trial-1.4.0.3301.exe", installed on Windows Server 2003 R2 Enterprise SP2 with 8 GB RAM. This server has only SSIS 2005 installed, no SQL Server service. It connects to the source PostgreSQL DB via the PGNP OLE DB provider, and connects to a separate SQL Server database on another server via an OLE DB destination.

Package TEST_BLOB
Validation has started
[DTS.Pipeline] Information: Validation phase is beginning.
Progress: Validating - 0 percent complete
Progress: Validating - 50 percent complete
Progress: Validating - 100 percent complete
Validation is completed (2)
Start, 8:36:22 AM
[DTS.Pipeline] Information: Validation phase is beginning.
Progress: Validating - 0 percent complete
Progress: Validating - 50 percent complete
Progress: Validating - 100 percent complete
[DTS.Pipeline] Information: Prepare for Execute phase is beginning.
Progress: Prepare for Execute - 0 percent complete
Progress: Prepare for Execute - 50 percent complete
Progress: Prepare for Execute - 100 percent complete
[DTS.Pipeline] Information: Pre-Execute phase is beginning.
Progress: Pre-Execute - 0 percent complete
[OLE DB Source [1]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "PostgreSQL Native Provider" Hresult: 0x80040E14 Description: "out of memory for query result ".
[DTS.Pipeline] Error: component "OLE DB Source" (1) failed the pre-execute phase and returned error code 0xC0202009.
Progress: Pre-Execute - 50 percent complete
[DTS.Pipeline] Information: Cleanup phase is beginning.
Progress: Cleanup - 0 percent complete
Progress: Cleanup - 50 percent complete
Progress: Cleanup - 100 percent complete
[DTS.Pipeline] Information: "component "OLE DB Destination" (40)" wrote 0 rows.
Task Data Flow Task failed
Finished, 8:38:44 AM, Elapsed time: 00:02:21.969


Here is the DDL for the destination (SQL Server) and source (PostgreSQL) tables; I've removed all the extra columns and left just the PK and the BLOB (image/bytea) column:

CREATE TABLE [dbo].[pgoledb_test_blob](
[image_id] [int] NOT NULL,
[image_type] [nvarchar](100) NOT NULL,
[image_data] [image] NOT NULL,
CONSTRAINT [pk_pgoledb_test_blob] PRIMARY KEY CLUSTERED
(
[image_id] ASC,
[image_type] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO


CREATE TABLE pgoledb_test_blob
(
image_id integer NOT NULL,
image_type character varying(100) NOT NULL,
image_data bytea NOT NULL,
CONSTRAINT pk_pgoledb_test_blob PRIMARY KEY (image_id, image_type)
)
WITH (
OIDS=FALSE
);

Here are the statistics for the Postgres source table. It is not a big table at all, and nowhere near the largest we need to transfer:
Sequential Scans 10
Sequential Tuples Read 972890
Index Scans 0
Index Tuples Fetched 0
Tuples Inserted 118679
Tuples Updated 0
Tuples Deleted 0
Tuples HOT Updated 0
Live Tuples 118679
Dead Tuples 0
Heap Blocks Read 8522
Heap Blocks Hit 122855
Index Blocks Read 920
Index Blocks Hit 301608
Toast Blocks Read 3466557
Toast Blocks Hit 3542508
Toast Index Blocks Read 32885
Toast Index Blocks Hit 8041959
Last Vacuum
Last Autovacuum
Last Analyze 2014-11-10 08:32:21.741411-05
Last Autoanalyze 2014-11-07 16:47:37.41419-05
Table Size 7128 kB
Toast Table Size 3941 MB
Indexes Size 3680 kB
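Since nearly all of the data lives in the TOAST table (3941 MB), it may help to check whether a single oversized image_data value, rather than the total volume, is what exhausts the provider's result buffer. A diagnostic sketch, run in psql against the source table above:

-- Diagnostic sketch: look at the size distribution of image_data to see
-- whether one huge value (rather than the 3.9 GB total) could blow the
-- client-side result buffer.
SELECT count(*)                              AS row_count,
       max(octet_length(image_data))         AS max_bytes,
       avg(octet_length(image_data))::bigint AS avg_bytes,
       sum(octet_length(image_data))         AS total_bytes
FROM pgoledb_test_blob;

With ~118,679 rows and ~3.9 GB of TOAST data, the average value is on the order of 35 KB, so a large max_bytes here would point at a few outlier rows.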


Here are some settings I've changed in an attempt to eliminate the "out of memory" on the PostgreSQL connection side. (The process fails before the first row is ever written to SQL Server, so the issue appears to be solely on the PostgreSQL source connection.)

set Fetch Rows to 1 - memory error
set Fetch Rows to 10 - memory error
set Fetch Rows to 100 - memory error (default)
set Fetch Rows to 1000 - memory error

unchecked TextAsLongVarChar - memory error
checked TextAsLongVarChar (default) - memory error

Bulk Insert Method - value sets (default) - memory error
Bulk Insert Method - pipe/copy - memory error
Bulk Insert Method - copy command - memory error

Use Cursor - checked (default) - memory error
Use Cursor - unchecked - memory error

set SSIS max insert commit size to 100 - memory error

Data Source=sbspgint;User ID=cepc;Initial Catalog=cmn_cepc_target_1;Provider=PGNP.1;Persist Security Info=True;Extended Properties="PORT=5433;MIN_FETCH_ROWS=10;SSL=allow;";

added "MIN_FETCH_ROWS=10;" to extended properties - memory error

changed Fetch Rows to 2000 to match min_fetch_rows - memory error
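One way to isolate whether the failure is in the data itself or in how the OLE DB layer buffers the result is to stream the same query through an explicit server-side cursor in psql, outside SSIS entirely. A hypothetical isolation test (batch size of 100 is arbitrary):

-- Run in psql against the source DB: stream the table through an
-- explicit server-side cursor so only one small batch of rows is ever
-- held in memory at a time.
BEGIN;
DECLARE blob_cur NO SCROLL CURSOR FOR
    SELECT image_id, image_type, image_data
    FROM pgoledb_test_blob
    ORDER BY image_id, image_type;
FETCH FORWARD 100 FROM blob_cur;   -- repeat until no rows come back
CLOSE blob_cur;
COMMIT;

If the cursor loop completes cleanly, the data streams fine from the server, which would suggest the provider is materializing the entire result set client-side; that would also be consistent with the error appearing during Pre-Execute, before any rows reach SQL Server.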


Any thoughts or suggestions on what may be causing the Postgres OLE DB connection to fail would be greatly appreciated. Thanks!