[R] Reading in 9.6GB .DAT File - OK with 64-bit R?
    Sarah Goslee
    sarah.goslee at gmail.com
    Fri Mar  9 00:54:29 CET 2012

Hi,
On Thu, Mar 8, 2012 at 6:45 PM, RHelpPlease <rrumple at trghcsolutions.com> wrote:
> Hi Jeff & Steve,
> Thanks for your responses.  After seven hours R ran out of memory and
> the run died.  The machine currently has 4GB of RAM; I'm looking to
> install more tomorrow.
You can't load a 9.6GB dataset into 4GB of RAM: R keeps its data in
memory, and reading a file typically needs substantially more memory
than the finished object, so even after adding RAM you'd want well
over 9.6GB free to read the whole file in one go.
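If you only need summaries or a subset, one free approach is to work
through the file in chunks, so only a slice is in memory at any one
time.  An untested sketch; the file name, chunk size, and the guess
that your .DAT is whitespace-delimited are all placeholders:

con <- file("bigfile.dat", open = "r")
repeat {
  lines <- readLines(con, n = 100000)  # read up to 100,000 lines
  if (length(lines) == 0) break        # stop at end of file
  tc <- textConnection(lines)
  chunk <- read.table(tc)              # parse just this chunk
  close(tc)
  ## summarize or filter 'chunk' here; keep only the small result
}
close(con)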
> I will look into SQLite3; thanks!
>
> I've read that an SQL database would be a great tool for data of this
> size (reading in and manipulating it), but I understand there can be a
> hefty price tag (similar to SAS licensing costs?).  At this time I'm
> looking for a low-cost solution, if possible.  After this project a
> database program would not be needed again; also, of the multiple data
> sets to synthesize, only a handful are this large.
SQL is just the query language; it's particular database products that
carry the hefty price tags.  SQLite3 is free and doesn't require you to
set up a server and client, and you can use it from R through the
RSQLite or sqldf packages.  More powerful relational databases like
MySQL and PostgreSQL are also available for free, but require more
effort to configure and use.
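For example, with sqldf you can pull just the rows you need straight
into R: read.csv.sql() stages the file in a temporary SQLite database
behind the scenes and returns only what the query selects, so the full
9.6GB never has to fit in RAM.  Untested sketch; the file name,
separator, and column name are placeholders, since I don't know your
.DAT layout:

library(sqldf)
## in the SQL, the input file is always referred to as 'file'
wanted <- read.csv.sql("bigfile.dat",
                       sql = "select * from file where speed > 100",
                       header = TRUE, sep = ",")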
> Thanks & please lend any other advice!
Sarah
-- 
Sarah Goslee
http://www.functionaldiversity.org
    
    