jofla_net's Hacker News comments

Buyer beware: I think this is extremely brand-dependent.

While I've had generally solid experience with SanDisk for almost 20 years, and had a few old drives (which I hear are SLC-based, so it's not surprising) hold files for over 5 years with no issue, I recently almost lost over 4 years of photos.

I had purchased some Lexar drives from Costco about 2 years ago, since they were dual-interface (USB-A / USB-C), and it was useful to just get some pictures off my phone. I usually don't rely on such a setup for long-term storage, but as with all things I was delayed tending to it. There were 2 per box, so I just copied everything twice, and diffed the copies several times to make sure they were exact.
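For what it's worth, the copy-and-diff check is easy to automate so it can be rerun periodically. A minimal stdlib-only sketch (function names are my own, not from any particular tool) that compares two directory trees by SHA-256:

```python
import hashlib
from pathlib import Path

def tree_digests(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def copies_match(a: Path, b: Path) -> bool:
    """True iff both trees contain the same files with identical contents."""
    return tree_digests(a) == tree_digests(b)
```

Rerunning something like this once a year would catch rot on either copy while the other is still good.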

After 24 months, one of the drives showed a 95% loss: almost every picture was damaged, with the bottom half or so cut off. The other drive surprisingly seemed fine; it had been plugged in every 6-9 months, as I recall, because I wanted to browse it a few times, and it seems that action saved the volume. Upon further inspection the good drive had still lost 10 pictures out of about 5,000, so it wasn't perfect.

Lexar.

https://www.ebay.com/itm/176810492981?chn=ps&_trkparms=ispr%...


> After 24 months, one of the drives showed a 95% loss: almost every picture was damaged, with the bottom half or so cut off.

If these are JPEGs with a grey or green lower half, it's likely only a few 16x16 macroblocks are corrupted and you can recover the rest.

This cannot be done programmatically because you have to guess what the average colour of the block was, but it can be worth it for precious pictures.


This is one of the reasons I like the PNG format: every chunk carries a CRC-32 checksum. You can fix a surprising number of broken files by brute-force testing plausible errors until the checksum passes.

With JPEG one of the big problems is that the data is Huffman-encoded without any inter-block markers (except maybe Restart Markers, if you're lucky). This means that a single bitflip can result in a code changing length, causing frameshifts in many subsequent blocks and rendering them all undecodable. If you have a large block of missing data (e.g. a 4k-byte sector of zeros), then you have to guess where in the image the bitstream resumes, in addition to guessing the value of the running DC components.
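Those restart markers are the one thing you can locate mechanically: inside entropy-coded data a literal 0xFF byte is always stuffed as FF 00, so any FF D0..FF D7 pair really is a resync point. A small sketch (function name mine) that scans a byte stream for them:

```python
def restart_markers(data: bytes) -> list[tuple[int, int]]:
    """Scan a JPEG byte stream for restart markers (FF D0..FF D7).
    Inside entropy-coded data a literal 0xFF is stuffed as FF 00,
    so an FF Dn pair is a genuine resync point. Returns
    (offset, marker_index) pairs."""
    hits = []
    for i in range(len(data) - 1):
        if data[i] == 0xFF and 0xD0 <= data[i + 1] <= 0xD7:
            hits.append((i, data[i + 1] - 0xD0))
    return hits

# Fabricated stream: payload, RST0, payload with a stuffed FF 00, RST1.
stream = b"\x12\x34" + b"\xff\xd0" + b"\xff\x00\x56" + b"\xff\xd1"
assert restart_markers(stream) == [(2, 0), (7, 1)]
```

If the encoder emitted restart markers at all, each one resets the Huffman bitstream and the running DC predictors, which is exactly what bounds the damage from a frameshift.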


It's surprising that it can't be done programmatically, since "minimize the color difference above and below this super obvious line" seems like it should be a pretty straightforward success criterion.

AI may be the “killer app” for these kinds of “back up and squint” judgment calls.

Yes, they were grey.

To pile on another anecdote: late in 2018 I put a well-used PC with an Intel SSDSCKKW240H6 (240GB SSD in M.2 form factor) in storage and picked it up 5 years later. The SSD was unreadable by then. The PC (with a different storage system) still runs, though the fan control apparently took a beating on first boot after such a long dormancy, and the CMOS battery has since been depleted.

Really, who knows if you're getting a legit drive any longer...

https://www.tomshardware.com/pc-components/ssds/fake-samsung...


Well I just ordered a drive from sandisk.com, hopefully they’re not mailing out counterfeits.

Could it possibly be that it wasn't the drive, but maybe the import application?

https://news.ycombinator.com/item?id=45274277 (Apple Photos corrupts images on import - images truncated)


> Why do these services have to suck so much.

They can do what they please. It's due to network effects. The tie-ins of tech are so strong, I'd wager that 99% of why they succeed has nothing to do with competency or making a product for the user; people are just too immobile to jump ship, for too many reasons. It's staggering how much stronger this effect is than people give it credit for. It's as if you registered all your cells with a particular pain medication provider, and the idea of switching pills sent you into acute neurosis.


Someone needs to reimplement a "clean" version of its functionality: professional networking is too important to be left to the data hoarders/government surveillance cluster of organizations.

Besides, its UX has decayed to a "Facebook for the employed", where John Doe praises himself for mastering a mandatory training at work or taking Introduction to HTML at "Harvard" via Coursera.


Nobody is coming to save us. A federated LinkedIn would be great but will not take over. We just need to stop using these services


The problem is a competitor will never be able to succeed without doing the same thing. Try to compete as a "free" service and you'll have to sell ads, try to charge and you'll never get enough signups to fund the business.


but only one browser


Doesn't surprise me; Seagate is marching to its own drum. My experience definitely mirrors others' higher-than-average failure rate as well.

My latest 'fun' experience with them came in the form of an IronWolf drive which is 'detected' on a USB-to-SATA interface only about 15% of the time when plugged in, while it starts up consistently on a plain SATA interface. This makes it unusable for what I need. No other drive or manufacturer ever fails on this USB-to-SATA adapter, just the new IronWolf. It appears the drive is actually intended for the Chinese market but was sold on Newegg, though that's not necessarily Seagate's fault.


Hey Flock, show me all Toyota Tacomas with a Raiders sticker that have passed any checkpoint in the last 35 minutes... I don't even care about the license plate.


Thank you for doing this.


Yes, I look at this in a similar vein to the (Eval <--> Apply) cycle in the SICP textbook, as a (Design <--> Implement) cycle.


chains, more like it...


This is gold.

Rarely has someone taken the words right out of my mouth like this.

The percentage of devs in my career that come from the same academic background, show similar interests, and approach the field in the same way is probably less than 10%, sadly.


Both are true, and both should be allowed to exist as they serve different purposes.

Sound engineers don't use lossy formats such as MP3 when making edits in preproduction work; those formats are intended for end users, and regenerating them at each edit would degrade quality cumulatively. In the same way, someone working on software shouldn't be required to use an end-user consumption system when they are at work.

It would be unfortunate to see the nuance missed: just because a system isn't 'new' doesn't mean it needs to be scrapped.


I mostly agree but ...

> In the same way someone working on software shouldn't be required to use an end-user consumption system when they are at work.

I'm worried that many software developers (including me, a lot of the time) will only enable security after exhausting all other options. So long as there's a big button labeled "Developer Mode" or "Run as Admin" which turns off all the best security features, I bet lots of software will require that to be enabled in order to work.

Apple has quite impressive frameworks for application sandboxing. Do any apps use them? Do those DAWs that sound engineers use run VST plugins in a sandbox? Or do they just dyld + call? I bet most of the time it's the latter. And look at this Notepad++ attack. The attack would have been stopped dead if the update process had validated digital signatures. But no, it was too hard, so instead their users' computers got hacked.
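Even the weakest form of that validation is cheap. A sketch (function name mine, not from any real updater) of pinning a digest published over a separate trusted channel; a real updater should verify an actual digital signature (Authenticode, Ed25519, etc.), but even this would reject a swapped binary:

```python
import hashlib

def verify_update(payload: bytes, expected_sha256: str) -> bool:
    """Refuse to install unless the downloaded bytes hash to the digest
    published over a separate trusted channel. A proper updater should
    verify a digital signature instead; this is merely the cheapest
    check that would have stopped a swapped binary."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

good = b"installer-bytes"
pinned = hashlib.sha256(good).hexdigest()
assert verify_update(good, pinned)
assert not verify_update(good + b"\x00", pinned)  # tampered download rejected
```

The hard part isn't the check; it's distributing the pinned value (or public key) out of band so the attacker who controls the download server can't also rewrite it.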

I'm a pragmatist. I want a useful, secure computing environment. Show me how to do that without annoying developers and I'm all in. But I worry that the only way a proper capability model would be used would be by going all in.

