I used to do this. A long time ago, I had a setup that would sync feeds from BBC, CNN, Janes [0], and a few other places; extract and sanitize [1] the text of the previous day's articles; and ultimately prepare a nicely typeset, five-column, landscape-oriented PDF. It took some trial and error to get something usable, but the printed result was a pleasant way to stay informed.
That was many moons ago.
---
[0] Circa 2008, I could find several feeds from Janes that carried meaty non-subscriber extracts.
[1] LaTeX is picky about all sorts of "special" characters that are more common than one might expect.
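To illustrate the sanitizing step: a minimal sketch of escaping LaTeX's reserved characters with sed before dropping article text into a .tex file. This is not the original script, just one way to do it; the `@BS@` placeholder trick keeps the backslash rule from mangling the later replacements, and `\^` as a literal caret assumes GNU or BSD sed behavior.

```shell
# Escape LaTeX special characters in stdin. Backslashes are parked in a
# placeholder first so the brace-escaping rule doesn't touch them, then
# expanded to \textbackslash{} at the end.
latex_escape() {
  sed -e 's/\\/@BS@/g' \
      -e 's/\([{}#$%&_]\)/\\\1/g' \
      -e 's/~/\\textasciitilde{}/g' \
      -e 's/\^/\\textasciicircum{}/g' \
      -e 's/@BS@/\\textbackslash{}/g'
}

printf '%s\n' 'AT&T up 5% on Q3 _results_' | latex_escape
```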
In many cases, following the links in the item entries will get you the full text. Just to get something going quickly, you can do a `links -dump <url>` to get plain text and then cut it down with some bash-fu. That said, if you pull a lot from a given source, it's worth figuring out a reliable way of cracking out the title and text so you can format them sensibly.
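A rough sketch of that quick-start path: the filtering is factored into a function so it can be seen on its own, with a here-doc standing in for the `links -dump` output. The filters (strip leading/trailing whitespace, drop blank lines) are illustrative, not the original bash-fu.

```shell
# Trim a plain-text dump: strip surrounding whitespace, drop empty lines.
tidy() {
  sed 's/^[[:space:]]*//;s/[[:space:]]*$//' | awk 'NF'
}

# Against a live page (assumes `links` is installed):
#   links -dump "$url" | tidy > article.txt
# The first surviving line is often a workable title guess:
#   head -n 1 article.txt

tidy <<'EOF'
   Example Headline

   First paragraph of the article.
EOF
```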