
BotW got some after-launch patches that significantly reduced the launch time.

They did this by overclocking the Switch during load. I didn't see any mention of boost/overclock in the Factorio writeup, though.



Interesting, I wasn't aware (or forgot)! I looked into it though, and it seems like it wasn't anywhere near the minute+ mentioned in the article.

I found a GameFAQs forum thread[1] from shortly after the game launched that claims it was about 20 seconds, and a USgamer article[2] that says the update that came a couple years later only shaved off about 5 seconds (within the ballpark of what I measured).

[1] https://gamefaqs.gamespot.com/boards/189707-the-legend-of-ze...

[2] https://www.usgamer.net/articles/breath-of-the-wild-load-tim...


Do you know if this technique is available for all games or just Nintendo's own? It seems like it would be easy to cause issues if games were allowed to mess with the clocking willy-nilly, so I assumed that it was only done on Zelda because it's a first-party title (and one of the biggest on the console).


I've used this technique in the past on various shipped Switch games. It's an API that puts the Switch into "CPU Boost Mode".

It overclocks the CPU at the expense of the GPU. It's suitable only for loading screens or other areas of the game where you don't need to render at more than single-digit FPS, but it can produce a 20-50% improvement in loading time.
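The pattern is simple: enable boost for the duration of the load, then restore normal clocks before gameplay. Here's a minimal sketch of that shape. The real thing is a platform call (the homebrew toolchain libnx exposes it as appletSetCpuBoostMode); set_cpu_boost_mode and load_level below are hypothetical stand-ins.

```python
from contextlib import contextmanager

def set_cpu_boost_mode(enabled: bool) -> None:
    # Stand-in for the platform API: the real call trades GPU clock
    # for CPU clock. Here we just record the requested state.
    set_cpu_boost_mode.enabled = enabled

set_cpu_boost_mode.enabled = False

@contextmanager
def cpu_boost():
    """Boost the CPU only while loading, then always restore."""
    set_cpu_boost_mode(True)
    try:
        yield
    finally:
        # Restore unconditionally: gameplay needs the GPU back at full clock,
        # even if loading raised an exception.
        set_cpu_boost_mode(False)

def load_level():
    # Heavy CPU-bound work (decompression, deserialization) goes here,
    # while the GPU sits nearly idle on a loading screen.
    assert set_cpu_boost_mode.enabled

with cpu_boost():
    load_level()

print(set_cpu_boost_mode.enabled)  # → False
```

Wrapping it in a context manager (or RAII guard in C++) is the key design choice: it guarantees boost is never left on once rendering resumes.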


Interesting, curious if Factorio is using it then.


I work on the game, and can confirm this technique is used wherever it makes sense!


This article[1] claims Crash Team Racing takes advantage of boost mode, so seemingly it's available to third-party developers.

Perhaps they don't get to mess with the clock "willy nilly" and it's more like an API that enables a higher - but fixed - clock speed?

[1] https://www.shacknews.com/article/112895/crash-team-racing-n...


This is available for any game since 2019, here’s an article talking about it: https://www.eurogamer.net/digitalfoundry-2019-nintendo-switc...


Why is this manual rather than a dynamic allocation of the power budget, like gaming laptops use? There the driver decides whether the CPU or GPU is "starved" and gives the other more power.


One theory: if a game is GPU-limited, which most are, the GPU will sit at 100% utilization no matter how much power it gets. The CPU, however, can't be power-limited as aggressively. Games have a physics loop that has to run at a constant rate, independent of rendering. If the CPU is already at 100%, any disturbance might cause the physics step to not finish in time, and the game crashes...
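The "constant rate, independent of rendering" part is the standard fixed-timestep loop: physics advances in fixed-size steps, and if frames come in slowly the loop runs several steps per frame to catch up. That catch-up work is exactly why the CPU can't be starved. A minimal sketch (all names hypothetical):

```python
FIXED_DT = 1.0 / 60.0  # physics step: 60 Hz, independent of render FPS

def simulate(frame_times):
    """frame_times: wall-clock duration of each rendered frame.
    Returns how many fixed physics steps were executed in total."""
    accumulator = 0.0
    physics_steps = 0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Drain the accumulator in fixed-size steps. A slow frame means
        # multiple physics steps must run back to back on the CPU.
        while accumulator >= FIXED_DT:
            # step_physics() would go here
            accumulator -= FIXED_DT
            physics_steps += 1
    return physics_steps

# Two seconds of wall time yields the same physics work whether the game
# renders at a smooth 60 FPS or a choppy 10 FPS:
fast = simulate([1 / 60] * 120)  # 60 FPS for 2 s
slow = simulate([1 / 10] * 20)   # 10 FPS for 2 s
print(fast, slow)  # → 120 120
```

Note the asymmetry: dropping GPU power just lowers the frame rate, but dropping CPU power below what the fixed-rate loop needs means the accumulator grows faster than it drains, and the simulation can never catch up.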



