Error code 0xC0047020

  • Question

  • Hi all,

    I have an SSIS package that loads from a Flat File source to a SQL Server destination (using an OLE DB Destination). The flat file has almost 40 columns (~90k records). I am using 15 Derived Column and 15 Union All components in the Data Flow.

    The issue is that the package runs fine on my local machine, but when I deployed it and started running it as a job on the server, it failed with this error. I am not using any package configurations.

    "The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020."

    Has anyone faced a similar issue? Do I need to take any extra steps while deploying? Any help is appreciated.

Answers

  • Hi guys,

       Thanks for your valuable suggestions. My problem was actually resolved that day by changing the DefaultBufferSize property of the Data Flow Task. I increased it by 10 times and the package started working.

    • Proposed as answer by

      Friday, March 16, 2012 5:41 PM

    • Marked as answer by
      Eileen Zhao
      Tuesday, March 27, 2012 12:43 PM

    • Edited by
      Eileen Zhao
      Wednesday, March 14, 2012 8:59 AM

I have problems with an SSIS process (actually the same problem occurs for two different processes). We are doing some ETL work using SSIS. We have a Business Intelligence project that executes without errors from Visual Studio. However, when it is deployed on the server and scheduled as a job, it fails with errors like:

INTRASTAT_Towar:Error: SSIS Error Code
DTS_E_PROCESSINPUTFAILED. The ProcessInput
method on component "Union All 3" (876) failed with error
code 0xC0047020 while processing input "Union All Input
2" (916). The identified component returned an error from
the ProcessInput method. The error is specific to the
component, but the error is fatal and will cause the Data
Flow task to stop running. There may be error messages
posted before this with more information about the failure.

INTRASTAT_Towar:Error: SSIS Error Code
DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput
method on istat_dekl_wynik_poz SELECT returned error
code 0xC02020C4. The component returned a failure
code when the pipeline engine called PrimeOutput(). The
meaning of the failure code is defined by the component,
but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with
more information about the failure.

INTRASTAT_Towar:Error: The attempt to add a row to the
Data Flow task buffer failed with error code 0xC0047020.

The other errors returned are similar; sometimes instead of 0xC0047020 there is an unspecified error. The errors occur only on this one table, which has a lot of different stuff inside its Data Flow Task (unions, multicasts, a conditional split). Most of the other data flows have just a source, a destination, and a transformation, and they are not causing any problems. I've been advised to try manipulating the DefaultBufferMaxRows and DefaultBufferSize properties of the data flow task, but after doing some research I don't believe that will solve the issue, as they are currently at their default values. Any suggestions?

  • Question

  •  Hi all,

    I have a package that transfers data from an OLE DB Source (TableA) to an OLE DB Destination (TableB).
    TableA has around 7,500,000 records.
    When I run the package it starts well, but after around 200,000 records have been inserted into TableB the package fails and TableA turns red. This is a summary of the error messages:

    Unspecified error

    SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Union All 11" (16989) failed with error code 0xC0047020 while processing input "Union All Input 2" (17111). The identified component returned an error from the ProcessInput method.
    The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

    The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020

    I've searched for this problem, and I read in a forum that someone suggested changing the Data Access Mode of the OLE DB Destination from "Table/View Fast Load" to "Table/View". I tried that, but performance became very slow, and I still got the same
    error after around 300,000 records had been inserted into TableB.

    Does anybody know what might be causing this error?

    Thank you

Answers

  • Hi All,

    I got this error on the source and destination, not because of the transformation (the Union All component).

    Sorry, I wanted to summarize the error messages because the event log displayed the same error message several times, but I missed something. These are the error messages:

    The ProcessInput method on component "Union All 12" (8507) failed with error code 0xC0047020 while processing input "Union All Input 1" (8508). The identified component returned an error from the ProcessInput method. The error is specific to the component,
    but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

     INSERT TABLEA TO TABLEB {60378B3F-72AA-4B1B-BFB4-6ED915AB7B54} {A2115E0C-B659-4865-8D5E-1192ACB1319F} SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "TABLEA" (1) returned error code 0xC02020C4. 
    The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this
    with more information about the failure.

     INSERT TABLEA TO TABLEB {60378B3F-72AA-4B1B-BFB4-6ED915AB7B54} {A2115E0C-B659-4865-8D5E-1192ACB1319F} The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

    For now the error can be worked around by using a SQL Server Destination instead of an OLE DB Destination, and setting the Timeout property to a higher value (for example, 500).

    But I'm still wondering why I need to do this. It seems the connection to the destination database timed out. Is it possible that, because I have a fairly complicated transformation, it simply takes too long to pass the records from source to destination?

    And we can't use a SQL Server Destination if the destination database is not on the same server as our package, right? Can we set a timeout on an OLE DB Destination? I have tried setting the OLE DB Destination properties Rows per batch = 100 and Maximum insert commit size = 100,
    but I still got the same error messages :(

    Thank you for your help.

    Regards,

    Ratna

    • Marked as answer by

      Tuesday, July 26, 2011 1:17 AM

  • Question

  • Hello,

    I am getting errors on a process that has been running fine for six months now… and today it is failing…

    [Destination Table - stageFactStateDaily [49]] Error: The attempt to send a row to SQL Server failed with error code 0x80004005.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Destination Table - stageFactStateDaily" (49) failed with error code 0xC02020C7 while processing input "SQL Server Destination Input" (65). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
    [Source Query - stageFactStateDaily [1]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "Source Query - stageFactStateDaily" (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

    select
    	ph.[DateTime]
    	,convert(int, ph.[DateTime]) as [DateKey]
    	,datepart(hh, ph.[DateTime]) as [HourKey]
    	,convert(bigint, convert(varchar, ph.PerformanceRuleInstanceRowId) + convert(varchar, ri.RuleRowId) + convert(varchar, ph.ManagedEntityRowId)) as PerfRuleKey
    	,ph.[PerformanceRuleInstanceRowId]
    	,ph.[ManagedEntityRowId]
    	,ph.[SampleCount]
    	,ph.[AverageValue]
    	,ph.[MinValue]
    	,ph.[MaxValue]
    	,ph.[StandardDeviation]
    from [Perf].[vPerfHourly] ph
    inner join [dbo].[vPerformanceRuleInstance] ri on
    	ph.[PerformanceRuleInstanceRowId] = ri.[PerformanceRuleInstanceRowId]
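
    One fragility worth noting in the PerfRuleKey expression above (an observation, not a confirmed cause of this failure): building a key by string-concatenating integer IDs can both collide and overflow. In T-SQL, `convert(varchar, …)` with no length defaults to 30 characters, and once the concatenated digits exceed the `bigint` range the conversion fails only for the affected rows, so a package can run for months and then die mid-stream. A small Python sketch of both hazards (the function and values are hypothetical):

    ```python
    # Sketch: hazards of building a surrogate key by concatenating integer IDs.
    # The parameter names mirror the query's columns; the values are made up.

    BIGINT_MAX = 2**63 - 1  # largest value a SQL Server bigint can hold

    def concat_key(instance_id: int, rule_id: int, entity_id: int) -> int:
        """Mimics convert(bigint, convert(varchar, a) + convert(varchar, b) + convert(varchar, c))."""
        digits = f"{instance_id}{rule_id}{entity_id}"
        value = int(digits)
        if value > BIGINT_MAX:
            raise OverflowError(f"{digits} does not fit in a bigint")
        return value

    # Hazard 1: different ID triples can produce the same key (collision).
    assert concat_key(1, 23, 4) == concat_key(12, 3, 4) == 1234

    # Hazard 2: large IDs overflow bigint only for *some* rows, so the
    # package runs fine for months and then fails mid-stream on new data.
    try:
        concat_key(9_999_999_999, 9_999_999_999, 1)   # 21 digits
    except OverflowError:
        print("overflow for large IDs")
    ```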

    So far 123,710,550 rows have been processed by this query…

    Any idea?
    Thanks
    Dom


    System Center Operations Manager 2007 / System Center Configuration Manager 2007 R2 Support

    • Moved by

      Wednesday, 30 December 2009 12:08 AM
      wrong forum (From:Monitoring and Analytics)


  • Hi,

    I am trying to transfer data from a flat file to SQL Server. When I run the package on the local (network) server it works fine. But when I use it to transfer the data to an online server, it starts, shows 2771 rows transferred, and then just stays there. When I stop the execution I get the following errors:

    [DTS.Pipeline] Error: The pipeline received a request to cancel and is shutting down.

    [Loose Diamond File [1]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

    [DTS.Pipeline] Error: The PrimeOutput method on component "Loose Diamond File" (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

    [DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.

    Can anyone help me find what the problem is?

    Thanks in advance.


  • subsonic-abercrombie

    Mr or Mrs. 500

    Points: 520

    I wonder if we should check the input-row buffer for a null value and set a flag to redirect the row to an error table of some sort.

  • peter.stanford

    SSC Enthusiast

    Points: 101

    January 20, 2008 at 5:22 pm

    #770022

    This is not a solution but another example of the same error.

    I’m writing a package that compares two tables of the same name in different databases. If the data exists in both tables I want to update the destination table with changes from the source table.

    If the data does not exist in the destination table, I want to insert it. I’ve written a query to select the correct data (SSIS Lookup Query.jpg) and tested that it works (SSIS Lookup Preview.jpg).

    I'm new to this, so I don't know whether the lookup should be a SELECT or an UPDATE, but other than that I would have thought this should work?

  • smurug

    SSC Enthusiast

    Points: 113

    Does anyone have any clue? I'm just stuck on this without any leads. I'm having the same problem as Grosshopper described.

  • asellmech

    SSC Enthusiast

    Points: 111

    Hello, I just had the exact same problem. I simply stopped using the "fast load" mode and it is working just fine now. Hope it helps.

    Regards,

  • ksk.sep19

    SSC Journeyman

    Points: 84

    I fixed it the same way, just by changing the data access mode from 'Table or view - fast load' to 'Table or view'. It's working fine now.

  • vasanti mestri

    Right there with Babe

    Points: 786

    Hey,

    I was also facing the same problem for the same task, so finally I tried this query in an Execute SQL Task in SSIS, and it works properly:

    INSERT table2 (col1, col2, col3)
    SELECT A.col1, A.col2, A.col3
    FROM table1 A
    LEFT JOIN table2 B ON A.key_PK = B.key_PK
    WHERE B.key_PK IS NULL
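
    The intent of the query above can be modeled as a set difference on the primary key: insert only the table1 rows whose key is not already present in table2. A minimal Python sketch of the same logic (the row values are made up for illustration):

    ```python
    # Model of: INSERT table2 SELECT ... FROM table1 A
    #           LEFT JOIN table2 B ON A.key_PK = B.key_PK WHERE B.key_PK IS NULL
    # Rows are (key_PK, col1) tuples; only keys absent from table2 are inserted.

    table1 = [(1, "a"), (2, "b"), (3, "c")]
    table2 = [(2, "b")]                       # key 2 already loaded

    existing = {key for key, _ in table2}     # keys already in the destination
    to_insert = [row for row in table1 if row[0] not in existing]

    table2.extend(to_insert)
    print(sorted(key for key, _ in table2))   # → [1, 2, 3]
    ```

    Because the load becomes idempotent, re-running it after a mid-stream failure does not duplicate the rows already inserted, which is likely why this approach sidestepped the error.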

  • fred_lauer

    SSC Enthusiast

    Points: 113

    This happens when you auto-generate a Data Flow step with the Import/Export Wizard. The wizard doesn't distinguish between a view and a table, so it will copy views in as tables. Check your source and destination steps for any that may be trying to copy a view to a view.

  • ratna_ain

    Say Hey Kid

    Points: 698

    I also had the same problem.

    Have any of you found the solution?

    I've tried changing the Data Access Mode of the OLE DB Destination from "Table/View Fast Load" to "Table/View", but I still got the same error :crazy:

    At the beginning of execution my package runs well; I get the error message in the middle of execution, after some records have already been inserted successfully into the destination table.

    Thanks a lot.

  • derald-764731

    SSC Enthusiast

    Points: 146

    In my case the problem occurred only on the items in the data flow task where the "Check Constraints" option was selected for the OLE DB Destination. Once I changed that, I had no issues. In my case I did not want the check, but your case might be different.

    Regards,

    Derald

  • feersum_endjinn

    SSCommitted

    Points: 1966

    In my case this was caused by the following properties:

    [Data Flow Task]

    DefaultBufferMaxRows

    DefaultBufferSize

    [SQLServerDestination]

    MaxInsertCommitSize

    You need to play around with those three parameters and find the correct values for your hardware/software/database configuration. First try the MaxInsertCommitSize parameter: it is set to 0 by default, which means a bulk insert performs a single transaction (one COMMIT) for the whole load. This can cause buffer issues if you don't have enough memory for a large dataset. In my case it was a transfer of 11 million rows that failed, and I have now set the parameter to 250,000. MaxRows is set to 10,000 and BufferSize is set to 104857600 (100 MB). As the articles below advise, try to avoid swapping to disk at all costs.

    Read these articles for some very useful tips:

    Top 10 SQL Server Integration Services Best Practices

    Improving the Performance of the Data Flow
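
    As a rough sketch of how the buffer settings above interact (a simplified model, not the engine's exact algorithm; the `rows_per_buffer` function is hypothetical and the 10,000-row / 10 MB figures are the commonly documented SSIS defaults):

    ```python
    # Simplified model of SSIS data-flow buffer sizing: the engine fills a
    # buffer with at most DefaultBufferMaxRows rows, shrinking that count
    # whenever estimated row width times rows would exceed DefaultBufferSize.

    def rows_per_buffer(row_width_bytes: int,
                        default_buffer_max_rows: int = 10_000,
                        default_buffer_size: int = 10_485_760) -> int:
        """Rows the engine can fit in one buffer for a given row width."""
        by_size = default_buffer_size // row_width_bytes
        return min(default_buffer_max_rows, by_size)

    # A narrow 100-byte row: DefaultBufferMaxRows (10,000) is the binding limit.
    print(rows_per_buffer(100))        # → 10000

    # A wide 4 KB row: only 2,560 rows fit in the default 10 MB buffer,
    # so widening DefaultBufferSize (as described above) raises throughput.
    print(rows_per_buffer(4096))       # → 2560
    print(rows_per_buffer(4096, default_buffer_size=104_857_600))  # → 10000
    ```

    Under this model, widening DefaultBufferSize only helps when row width, not DefaultBufferMaxRows, is the binding constraint, which matches the advice to tune all three settings together.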

  • blackadder

    SSC Veteran

    Points: 204

    Isn't it the case that Excel is full (64000 rows exceeded)?

  • Koen Verbeeck

    SSC Guru

    Points: 259075

    Sander Stuurwold-385722 (2/21/2012)


    isn’t it the case that excel is full? (64000 rows exceeded?)

    Remember that the latest post in this thread is already a few months old.

    Furthermore, Excel is mentioned nowhere in this thread (the OP is talking about a flat file), so no, it probably isn't the case that Excel is full.

  • feersum_endjinn

    SSCommitted

    Points: 1966

    I can't see how this can be an Excel error when Excel is not mentioned anywhere by the OP, and the OP is importing data to SQL Server.

    As I said in my earlier reply, what worked for me was to play around with those three settings. A few months after this post I started getting another related error for the largest of the source tables. In the end I had to change the insert from a SQL Server Destination / BULK INSERT to an OLE DB Destination, and the source was a SELECT statement (from the same table as before) but with SET ANSI_WARNINGS OFF at the start. For some reason, although the error is in fact a warning, SSIS stops executing the task and logs it as a task error.

