So far, I've only generated the graph for Europe (using GraphHopper). Even just Europe required 128GB of RAM and around 10 hours of computation time (the entire planet would likely need 384GB of RAM). I plan to add North America in a separate Docker container soon, though. I started with Europe because I'm familiar with some of the bike trails here, which makes it easier for me to check whether the routing makes sense.
> the entire planet would likely need 384GB of RAM
Unlikely. Even with turn costs enabled, 256GB (or less) is sufficient. You could also try disabling CH, as bike routing often doesn't require long routes. Here we have written down a few more details: https://www.graphhopper.com/blog/2022/06/27/host-your-own-wo...
Hey karussell, I really appreciate all the hard work you've put into GraphHopper. I couldn't have created this project without GH. I have a question about memory usage during the import stage (specifically in the OSM Reader's preprocessRelations function). I'm using a HashMap<Long, List<Long>> to map way IDs to OSM bike route relation IDs, which means allocating lots of arrays. Could this be causing me to run out of heap memory faster, or am I off base here?
I thought I would be able to compute the graph with 64GB of RAM, but it kept crashing before the CH and LM stage. After switching to a 128GB instance, it finally worked, hitting around 90GB at peak memory usage. For context, I was using 3 profiles (one with CH and two with LM) plus elevation data, and I used all of the tips from deploy.md.
Maybe you've already considered this, but there are a number of collection libraries out there that are optimized for holding Java primitives and/or very large data sets, which could help you save significant memory. Eclipse Collections [0] and fastutil [1] come to mind first, but there are many out there [2].
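To make the savings concrete, here's a minimal sketch of the boxed structure from the question above, with the fastutil equivalent sketched in comments (the fastutil class names are from that library; the way/relation IDs here are made up for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class WayRelationIndex {
    public static void main(String[] args) {
        // Boxed version: every key is a heap-allocated Long object (~16 bytes),
        // every value a List<Long> of boxed Longs, plus per-entry HashMap overhead.
        Map<Long, List<Long>> wayToRelations = new HashMap<>();
        wayToRelations.computeIfAbsent(123_456L, k -> new ArrayList<>()).add(42L);
        wayToRelations.computeIfAbsent(123_456L, k -> new ArrayList<>()).add(43L);

        // A primitive-specialized replacement with fastutil would look roughly like:
        //   Long2ObjectOpenHashMap<LongArrayList> map = new Long2ObjectOpenHashMap<>();
        //   map.computeIfAbsent(123_456L, k -> new LongArrayList()).add(42L);
        // storing raw 8-byte longs for keys and list elements instead of objects.

        System.out.println(wayToRelations.get(123_456L)); // relation IDs for way 123456
    }
}
```

The runnable part above is plain java.util; the fastutil variant is only sketched in comments since it needs the library on the classpath.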
Thank you! I'm a total Java noob - actually, this is the first project where I've written any Java code (I had to slightly modify the GraphHopper source code to suit my needs). Those libraries look very interesting. I'm saving this post for another battle with processing many GBs of OSM data :D
We already use carrotsearch (HPPC) internally, so you could replace the java.util classes like HashMap and ArrayList with it to reduce memory usage a bit. But it won't help much. E.g. growable data structures (in any standard library, btw) usually double their capacity at some point as they fill up and then copy from the old internal array to the new one, which means you briefly need roughly 3x the current size - and if that happens near the end of the import process, you have a problem. For that reason we developed DataAccess (in-memory or MMAP possible), which is basically a large List but 1. grows only segment by segment and 2. allows more than 2 billion items (beyond a signed int index).
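The segment-by-segment idea can be sketched in a few lines. This is my own toy illustration of the concept, not GraphHopper's actual DataAccess: appending allocates only the next fixed-size segment and never copies existing data, so peak memory stays near the real size instead of ~3x during a doubling resize, and the long index allows more than 2^31 items in principle.

```java
import java.util.ArrayList;
import java.util.List;

public class SegmentedLongList {
    private static final int SEGMENT_SIZE = 1 << 20;   // 1M longs per segment
    private final List<long[]> segments = new ArrayList<>();
    private long size = 0;

    public void add(long value) {
        int seg = (int) (size / SEGMENT_SIZE);
        int ofs = (int) (size % SEGMENT_SIZE);
        if (seg == segments.size())
            segments.add(new long[SEGMENT_SIZE]);      // allocate only the next segment
        segments.get(seg)[ofs] = value;
        size++;
    }

    public long get(long index) {
        return segments.get((int) (index / SEGMENT_SIZE))[(int) (index % SEGMENT_SIZE)];
    }

    public long size() { return size; }

    public static void main(String[] args) {
        SegmentedLongList list = new SegmentedLongList();
        for (long i = 0; i < 3_000_000; i++) list.add(i * 2);  // spans 3 segments
        System.out.println(list.get(2_500_000));
    }
}
```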
Another trick for planet-sized data structures is to use a List instead of the Map, with the OSM ID as the index. The memory overhead of a Map compared to a List is huge (and you could use DataAccess here too), and the OSM IDs for the planet are nearly contiguous, or at least don't have that many gaps (as those gaps are refilled, I think).
All these tricks (there are more!) are rather fiddly and low-level, but necessary for memory efficiency. A simpler option for your use case could be to just use a database, like MapDB or SQLite. But this might be (a lot) slower than the in-memory approach.
Nice landing page. The interactive illustrated calendar is great at showcasing the appeal. Love the idea of being able to look back months at a time and see what was going on in my life. Personally I prefer hosting my own locally but I might steal the emoji idea for my Obsidian journal.
I took a sabbatical that led me to pursue a coaching certification from a university across the country in Washington D.C. It was a wonderful experience and I connected with some great people, many of whom I still keep in close touch with today. I've met great people in tech too, but it was a breath of fresh air to work with people who are energized around helping individuals improve their lives.
I'm back in tech now, working for an AI services startup, but I leveraged my experience to create a role where I can bring coaching within the organization and to our clients through change management coaching.
I was inspired by my mindfulness practice and by Chade-Meng Tan's work at Google, where he used his 20% time to create mindfulness courses within the company. Especially as technological advancement continues to accelerate, there will be a huge need for personal growth initiatives to help people lean into the kinds of skills that AI can't replace: social/emotional intelligence, empathy, creativity, for example.
SGI members chant "Nam Myoho Renge Kyo", which invokes the Japanese title of the Lotus Sutra. The idea behind this short chant is to encapsulate the liberative message of the Sutra and to bring forth the "Buddha Inside".
What's top of mind for me currently, based on observations from my own experience: doubt can be strongly embodied. Meaning, it's not as simple as thinking your way out of it.
As much as I've achieved personally and professionally, when I put myself outside of my comfort zone and into the public arena of judging eyes, I still face doubt.
As much as I can steel myself mentally, remind myself of my accomplishments, analyze my situation, etc. there is a part of me, perhaps as a self-protection mechanism, that creates doubt. That doubt colors my entire experience, influencing my thoughts and actions. Overall it acts as a limiter. I can feel the difference in my body.
I can be aware of all of this, and counteract it to some extent, but it has its own gravity. It's all very impressive in the most literal sense of the word.
So I am learning more about how doubt shows up, and how to work with it beyond simple motivational tricks. For example: Letting go of the need to control it, getting very familiar with the nature of it, and reorienting myself even as I fail to accomplish my goals again and again.
Happy New Year HN. May you find the resilience to overcome whatever doubts are holding you back from living the life you want.