Why is it substantially slower to load 50GB of data from a gzipped file (20GB compressed) than to load 50GB of unzipped data? I'm using System.IO.Compression.GZipStream and it's not maxing out the CPU while loading the gzip data. I'm using the default buffer of the stream.

dotnet framework



Postby DR » Sat, 06 Sep 2008 07:53:29 GMT

Why is it substantially slower to load 50GB of data from a gzipped file (20GB compressed) than to load 50GB of unzipped data? I'm using System.IO.Compression.GZipStream and it's not maxing out the CPU while loading the gzipped data. I'm using the default buffer of the FileStream that I open on the 20GB gzipped file and pass into the GZipStream constructor. Reading through System.IO.Compression.GZipStream then takes an hour, while just loading the 50GB file of uncompressed data takes a few minutes!
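
For reference, here is a minimal sketch of the loading pattern described above (not the poster's actual code); the file path and the 1 MB buffer size are assumptions. It opens a FileStream on the gzipped file, passes it into the GZipStream constructor, and reads the decompressed data in large chunks. Wrapping the decompressor in a BufferedStream and giving the FileStream an explicit buffer is one variation worth trying, since the default FileStream buffer is only about 4 KB.

    // Minimal sketch (C#, .NET Framework). Path and buffer size are placeholders.
    using System;
    using System.IO;
    using System.IO.Compression;

    class GzipLoadSketch
    {
        static void Main()
        {
            string path = @"C:\data\bigfile.gz";   // hypothetical path to the 20GB gzipped file
            const int bufferSize = 1 << 20;        // 1 MB buffer (the post uses the stream's default)

            // FileStream opened on the gzipped file and passed into the GZipStream ctor,
            // as described in the post. The BufferedStream is an added assumption so that
            // each Read pulls a larger block of decompressed data.
            using (var file = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read, bufferSize))
            using (var gzip = new GZipStream(file, CompressionMode.Decompress))
            using (var buffered = new BufferedStream(gzip, bufferSize))
            {
                var chunk = new byte[bufferSize];
                long total = 0;
                int read;
                while ((read = buffered.Read(chunk, 0, chunk.Length)) > 0)
                {
                    total += read;   // consume/parse the decompressed bytes here
                }
                Console.WriteLine("Decompressed bytes read: " + total);
            }
        }
    }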


