[R] Performance & capacity characteristics of R?
Karsten M. Self
kmself at ix.netcom.com
Tue Aug 3 08:57:38 CEST 1999
I hope this is merely a FAQ, and not an AFAQ (an annoyingly frequently asked one).
I'm a SAS programmer with several years' experience of the system, now
evaluating alternatives. See the SAS for Linux website (URL in sig) for
more info.
I'm exploring R's capabilities and limitations. I'd be very interested
in a deeper understanding of its capacity and performance limits when
dealing with very large datasets, which I would classify as tables with
one million to hundreds of millions of rows and anywhere from two to
100+ fields (variables), generally of 8 bytes each -- call it a 16- to
800-byte record length.
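By way of illustration, here is the back-of-the-envelope sizing I have
in mind, done in R itself (assuming every field is an 8-byte double;
the figures are my own arithmetic, not measurements):

    ## Rough raw-data footprint, assuming 8-byte (double) fields throughout
    rows   <- 1e6               # low end of the row range: 1 million
    fields <- 100               # high end of the field range
    rows * fields * 8 / 2^20    # ~763 Mb of raw data, before any copies R makes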
Can R handle such large datasets (tables)? What are the general
parameters for memory requirements? How great a performance hit does
running into swap (virtual memory) entail? Which common procedures or
functions under R use significantly more memory than the size of their
inputs? Are there guidelines or documentation that point to the issues
and parameters of large-file or large-dataset processing under R?
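To make the question concrete, this is the sort of probe I have in mind
(the data frame here is purely hypothetical, just to show the kind of
measurement I'm after; object.size() and gc() are the calls I know of,
and whether they tell the whole story is exactly my question):

    ## Build a small hypothetical table and inspect its memory footprint
    x <- data.frame(matrix(rnorm(1e5 * 10), ncol = 10))  # 100,000 rows x 10 fields
    object.size(x)   # bytes consumed by the object itself
    gc()             # heap usage and garbage-collection statistics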
TIA.
--
Karsten M. Self (kmself at ix.netcom.com)
What part of "Gestalt" don't you understand?
SAS for Linux: http://www.netcom.com/~kmself/SAS/SAS4Linux.html
Mailing list: "subscribe sas-linux" to
mailto:majordomo at cranfield.ac.uk
11:45pm up 70 days, 51 min, 1 user, load average: 0.67, 0.38, 0.21