
I am missing this part too. I can't say I've ever had a problem upgrading Go to the latest version. Now with "go fix", a lot of features are even improved automatically.

Yes, QA is important. My code will always "work" in that everything I tested is bug free. But having someone else test, especially someone who knows the service, is gold.

But there is also bad QA: the most worthless QA I was forced to work with was an external company where I, as the developer, had to write the test sheet and they just tested that. Obviously they could not find bugs, as I had tested everything on the sheet.

My most impressive QA experience was when I helped out a famous Japanese gaming company. They tested things like pressing multiple buttons in the same frame and watched my code crash.


> But there is also bad QA: the most worthless QA I was forced to work with was an external company where I, as the developer, had to write the test sheet and they just tested that. Obviously they could not find bugs, as I had tested everything on the sheet.

This was my sole experience at the one place I worked with an internal QA team. They could absolutely never find bugs that devs missed, often reported bugs that didn't exist, and failed to find obvious edge cases that did exist.

Multiple devs were fired because the CEO believed QA over the engineering team: if they marked a bug as present, it was the engineer's fault for writing it; if they didn't catch a bug that made it to prod, it was the engineer's fault for not including it in the test plan. They represented nothing but red tape and provided no value.

Good QA sounds great! I'd love to know what that's like someday! It'd be great to have someone testing my code and finding breakages I missed! I'm only slightly (incredibly) bitter about my bad experience with its implementation.


I do think the type of testing where QA just follows a pre-written script has a place, but it is about long-term regression. The first round absolutely should not find anything, but with a complex system it should also find nothing in a year, or three, or five years... Offloading this to a dedicated resource could be useful in certain industries.

I did not think of that. For some industries it might make sense. But if I wanted a regression test, I would probably set it up as an automated test. In the case I mentioned above, it was the only test besides my own for a new service.

Not really that impressive; that's Testing Quick Attacks 101.

String views were a solid addition to C++, but they are still underutilized. It does not matter which language you are using: making thousands of tiny memory allocations during parsing hurts. https://en.cppreference.com/w/cpp/string/basic_string_view.h...

The issue with retrofitting things onto an existing, well-established language is that those new features will likely be underutilized, especially in other existing parts of the standard library, since changing those would break backwards compatibility. std::optional is another example: it is not used much in the C++ standard library itself, but would be much more useful if used across the board.

Contrast this with Rust, which had the benefit of being developed decades later. There, Option and str (string views) were in the standard library from the beginning, and every library and application uses them as fundamental vocabulary types. This is combined with good support for chaining and working with these types (e.g. Option has map() to transform the content if it exists and pass None along otherwise).

Retrofitting is hard, and I have no doubt there will be new ideas that can't really be retrofitted well into Rust in another decade or two as well. Hopefully at that point something new will come along that learned from the mistakes of the past.


Retrofitted patterns or ideas stay underutilized only when they are not worth the change. The string_view example is trivial: anyone who cared enough about the extra allocations (with no copy elision taking place) already rolled their own version of string_view or simply used the char+len pattern. Those folks do not wait for a new standard to come along when they can already have the solution now.

The std::optional example, OTOH, is also a bad one because it is heavily opinionated, and baking it into APIs across the standard library would be a really wrong choice.


Existing file IO APIs in the STL don't return string views into the library's file buffer (when using buffered IO). That is something you could do, as an example.

I don't think I agree that optional is opinionated. It is better to have an optional of something that can't be null (such as a reference) than to have everything be implicitly nullable (such as raw pointers). That way you have to handle the null case when it can happen, and only when it can happen.

There is a caveat for C++ though: optional<T&> is larger in memory than a raw pointer. Rust optimises this case to be the same size (one pointer) by noting that the zero value can never be valid, so it is a "niche" that can be used for something else, such as the None variant of the Option. Such niche optimisation applies widely across the language, to user-defined types as well. That would be impossible to retrofit onto C++ without at the very least breaking ABI, and probably impossible even at the language level. Maybe it could be done on a type-by-type basis with an attribute to opt in.


I work on a codebase heavily influenced by the same sentiment you share wrt optional, and I can tell you it's a nightmare. Has the number of bugs somehow magically decreased? No, it has not. As a matter of fact, the complexity it introduces, coupled with the monadic programming patterns that are normally enforced in such environments, made it more probable to introduce buggy code, at no obvious advantage but at great cost: ergonomics, reasoning about the code, and performance. So, yeah, I will keep the position that it is heavily opinionated and not solving any real problem until I see otherwise, meaning evidence in really complex C++ production code. I have worked with many traditional C and C++ codebases, so that is my baseline here. I prefer working with the latter.

Niche optimizations are trivial to automate in modern C++ if you wish. Many code bases automagically generate them.

The caveat is that niche optimizations are not perfectly portable, they can have edge cases. Strict portability is likely why the C++ standard makes niche optimization optional.


> optional<T&>

This is a C++26 feature which will have pointer-like semantics, aren't you confusing it with optional<reference_wrapper<T>> ?


C# gained similar benefits with Span<>/ReadOnlySpan<>. Essential for any kind of fast parser.

Swift too, in 6.3!

In C you have char*

Which isn't very good for substrings due to the null-termination requirement.

strtok() happily punches those holes in. Now you could argue that the resulting strings, while null-terminated, aren't true substrings (as the original string is now corrupted), but in the context of parsing (particularly here, using whitespace as the delimiter), that wouldn't be much of an issue.

struct Substring { const char *start, *end; };

My point is ownership being transferred implicitly in a struct assignment is a complexity introduced by C++.

In C the concern of allocating memory and using it is separate.

string_view is an attempt to add more separation. But C programmers were already there.


Well yeah, and you could always do the same thing in C++, but having a standard structure for it makes interoperability a lot easier.

An object of type char * isn't necessarily null-terminated.

No, but that makes it no longer a string as far as most C functions are concerned.

And the type system does not tell you if you need to call free on this char* when you’re done with it.

Correct. Haphazardly passing ownership of individual nodes around is a C++ and OOP anti-pattern.

In C you only have char*.

You can compose char* in a struct.

wchar_t exists.

(And the possibility to implement whatever you want, ofc.)


Not a specialist, just from what I heard: there are two things that make it work. First, they are not really "independent" like the title says: they sync with the grid frequency, and if the grid is down they shut off for safety. The other reason it works is that the grid power inside the home is just what you get as incoming power, ~230 V. In the US, for example, I think you get 240 V or so delivered to your house, but 120 V from the plug.

Very nice! In Tokyo we had this mural where some parts were animated using screens in the wall. https://www.teamlab.art/w/skytreemural/tokyoskytree/

I get the feeling. The 90s in particular, maybe even up until Crysis, went super fast. Then tech felt incredibly stagnant for over a decade.

But the time since 2020 feels much faster again. It's scary! But it's exciting.


For your Linux distro needs, Debian still provides the base ISO, and you can make a second disk with the packages you need and apt-offline.

Well, the point of this is to have a distro that contains "everything" (or at least a large amount of stuff), since I can't know ahead of time what I'd need.

I think it is still possible to use jigdo to make Blu-ray discs, but I do not have a Blu-ray drive :-P


I travel a lot and do the same. Yes, most places have internet, but I don't need much, and it's easier to have an "offline" folder with the docs you need than to carry around a satellite dish. It also works on an airplane.

Mine contains language, library, and game engine docs. Sometimes I back up some sites completely, but it's getting harder to do that as many sites block crawling now.


Great video. I think both EGA and VGA look good, depending on the scene (I prefer EGA backgrounds but VGA close-ups).

The music, however: the floppy version is best and the CD version is the worst. I played with the internal speaker myself. The CD music sounds off to me, but I cannot pinpoint why exactly.

CGA seems to be a 1-to-1 conversion of EGA. It only looks bad because of the strong cyan and magenta, but that's a hardware limitation, not an artistic choice.


> CGA seems to be a 1-to-1 conversion of EGA

I'm not sure. The dithering is obviously different, not only harsher but in different places in many scenes. Also, the splash screen doesn't have scrolling clouds in the CGA version. And there are other subtle changes.

Call me weird but there's a certain charm to the CGA version, though it's obviously the worst of them. My favorite is the EGA version.


JST here; it basically adds a day.

