The first thing I'd recommend is moving all of your files into AWS S3: http://aws.amazon.com/s3/
Storage is super cheap, and you can get rid of the clutter on your laptop. I wouldn't recommend moving to a database yet, especially if you haven't worked with one before. S3 has great connector libraries and good integrations with 'big data' analysis tools like Spark and Hadoop. I'd start down that path and see which of those tools is best suited to analyzing text files from S3!
AWS is also a new tool that has to be learned first, and frankly it sounds like overkill for this kind of problem (not to mention that it brings its own overhead).
lsiebert's comment about not generating such large text files in the first place is a good one: if you already have a tool that generates the text file and a tool that analyzes it, you should be able to run them simultaneously via some form of pipe, so that the intermediate result never has to be stored on disk.
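As a minimal sketch of that idea (using `seq` as a stand-in for your generator and `awk` as a stand-in for your analyzer, since the actual tools aren't named here):

```shell
# The pipe streams the generator's output straight into the analyzer;
# the intermediate text never touches the disk.
seq 1 100 | awk '{ sum += $1 } END { print sum }'
# prints 5050
```

Your real generator and analyzer would replace `seq` and `awk`; as long as the analyzer can read from stdin, the shape of the command stays the same.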
Finally, if that isn't possible for some reason, then depending on the storage space and throughput you actually need, simply getting an external disk may be the better way to go.