[R] Problem with memory for large datasets
    Uwe Ligges 
    ligges at statistik.uni-dortmund.de
       
    Wed Sep 24 17:53:13 CEST 2003
ZABALZA-MEZGHANI Isabelle wrote:
> Hello,
> 
> I would like to know whether it is possible to "clean" R's memory
> during an R session. I run many operations on large
> objects (matrices of 500 x 5000), and I cannot get to the end of my
> script because of a lack of memory. Of course, I have tried to remove all
> temporary objects during the script's execution and to run the garbage
> collector ... but this seems to have no effect ...
> 
> Any idea how to solve this problem without exiting R?
> 
> Regards
> 
> Isabelle
After you have removed unnecessary objects, the only thing you can do is
to increase the memory limit R uses (given you are on Windows). See
?memory.limit for details.
Attention: raising it beyond your physical RAM will cause your system to
swap heavily.
The best solution is to buy more memory and/or to optimize your code
(given that's possible).
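
The steps discussed above can be sketched as follows (the object name and
sizes are illustrative; memory.limit() is available on Windows only):

    ## a large temporary object, roughly the size mentioned in the question
    x <- matrix(rnorm(500 * 5000), nrow = 500)

    ## see which objects in the workspace take up the most space
    sapply(ls(), function(nm) object.size(get(nm)))

    rm(x)   # remove the temporary object
    gc()    # run the garbage collector to release the freed memory

    ## on Windows: check the current limit (in MB) and, if needed, raise it
    memory.limit()
    ## memory.limit(size = 1024)  # e.g. 1 GB; swapping starts beyond RAM

Note that gc() only helps if the objects have really been removed first;
memory still referenced by any object cannot be reclaimed.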
Uwe Ligges