or do I simply have to rewrite my code to manage many smaller files?
note: the same large file with 50000 lines works fine with loadStrings in standard Processing, however.
Since I am new to JavaScript with Processing, any help is greatly appreciated.
There is no StringBuffer in JS, and Processing.js simply retrieves your data using a synchronous "ajax" call, so the data limit is whatever the browser's connection data limit is, which should pretty much be "whatever fits in your RAM".
However, what you're more likely running into is that loadStrings() is a synchronous operation: if you call it, everything else halts, and your page, and all the scripts on it, have to wait for the load operation to finish. Browsers consider this extremely bad behavior, so they'll usually cut off the script at some point.
If you're loading huge data files, it's much better not to use loadStrings(), but instead do your data loading before you start your sketch, using an asynchronous ajax call with a callback that signals that your data is ready, so that your sketch can then be loaded.
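As a minimal sketch of that pattern: load the file asynchronously, and only start the sketch from the callback once the data has arrived. The helper name, the file name `bigdata.txt`, and the sketch wiring are all illustrative assumptions, not part of the Processing.js API.

```javascript
// Load a text file asynchronously, then hand its lines to a callback.
// Nothing on the page blocks while the request is in flight.
function loadData(url, onReady) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true); // true = asynchronous
  xhr.onload = function () {
    if (xhr.status === 200) {
      // Split the response into lines, like loadStrings() would return
      onReady(xhr.responseText.split(/\r?\n/));
    }
  };
  xhr.send();
}

// Only start the sketch once the data is available (browser-only code).
if (typeof window !== "undefined") {
  loadData("bigdata.txt", function (lines) {
    // At this point the data is ready; attach your sketch, e.g.:
    // var p = new Processing(canvasElement, function (sketch) { ... });
    console.log("loaded " + lines.length + " lines");
  });
}
```

The key difference from loadStrings() is that the browser's event loop keeps running during the download, so nothing gets cut off for blocking too long.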