
How different are those detectors? Is it possible they share a common or similar design?


Several of them use multiple, as-independent-as-possible detection methods. The most common are grids of ground stations that measure high-energy particles passing through and correlate them in time, followed by ultra-fast (in terms of data acquisition, e.g. 20M samples/s) telescopes pointed at the atmosphere (not the sky), which can see the faint trace of fluorescence light left behind by the air shower from up to tens of kilometers away.
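The time-correlation step for the ground-station grid can be sketched roughly like this. This is a toy illustration only: the station names, timestamps, coincidence window, and multiplicity cut are invented, and real arrays also use the station geometry and signal shapes, not just timing.

```python
def coincident_groups(hits, window_s=20e-6):
    """Group (station_id, timestamp) hits whose times fall within
    window_s of the first hit in the group; a real trigger would
    also check that the stations are geometric neighbors."""
    groups, current = [], []
    for station, t in sorted(hits, key=lambda h: h[1]):
        if current and t - current[0][1] > window_s:
            groups.append(current)
            current = []
        current.append((station, t))
    if current:
        groups.append(current)
    # Only multi-station coincidences look like air showers.
    return [g for g in groups if len(g) >= 3]

hits = [("A", 0.0), ("B", 5e-6), ("C", 9e-6),    # one shower candidate
        ("D", 2.0),                              # isolated noise hit
        ("A", 5.0), ("C", 5.000004), ("D", 5.000008)]
print(coincident_groups(hits))  # two 3-station candidates survive
```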

What's common to all of the detection methods I know of is the atmosphere. Back at the Pierre Auger Observatory, we used a combination of LIDAR scans and weather balloons to constantly monitor it. The atmosphere is basically a calorimeter for these detectors.

The fluorescence detectors (FD) are generally considered the most direct measurement of the primary particle's energy, because the fluorescence mechanism is relatively simple (somebody is bound to criticize me for saying that), with a proportionality constant that can be measured in the lab[1]. But there are still models and simulations that go into it. And these fluorescence telescopes can only be operated on moonless nights, so they have a duty cycle of only about 10%.
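The "calorimeter" picture above can be made concrete with a toy calculation: photons observed in each slant-depth bin, divided by a lab-measured fluorescence yield (photons per MeV deposited), give the energy deposit per bin, and the sum approximates the primary energy. The yield value and the profile numbers below are placeholders, not measured values.

```python
FLUO_YIELD = 19.0  # photons per MeV deposited; placeholder value only

def calorimetric_energy(photons_per_bin, yield_per_mev=FLUO_YIELD):
    """Invert the proportionality: deposit per bin = photons / yield,
    then sum over the longitudinal profile (result in MeV)."""
    return sum(n / yield_per_mev for n in photons_per_bin)

# A fake, roughly bell-shaped longitudinal light profile:
profile = [1.9e10, 7.6e10, 1.9e11, 7.6e10, 1.9e10]
print(f"{calorimetric_energy(profile):.3g} MeV")
```

In reality corrections are applied for atmospheric transmission, the detector aperture, and the "invisible" energy carried away by neutrinos and muons, which is where the models and simulations come in.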

E.g., Auger combines FD with surface detectors (SD), using simultaneously measured events to calibrate the more indirect energy measurement of the SD and thus make use of their near-100% duty cycle.
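The cross-calibration idea can be sketched as follows: for "hybrid" events seen by both detectors, fit a power law between the FD energy and the SD size estimator, then apply that relation to SD-only events. The functional form is a common choice for this kind of calibration, but the variable names and the synthetic numbers here are assumptions for illustration.

```python
import math

def fit_power_law(S, E):
    """Least-squares fit of log E = log A + B log S, i.e. E = A * S**B."""
    xs = [math.log(s) for s in S]
    ys = [math.log(e) for e in E]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    B = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    A = math.exp(my - B * mx)
    return A, B

# Synthetic hybrid events generated from E = 0.2 * S**1.05:
S = [10, 30, 100, 300]
E = [0.2 * s ** 1.05 for s in S]
A, B = fit_power_law(S, E)
print(round(A, 3), round(B, 3))  # recovers ~0.2 and ~1.05
```

With the fit in hand, any SD-only event's size estimator can be converted to an FD-calibrated energy, which is how the SD's high duty cycle gets tied to the more direct FD energy scale.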

[1] Other methods have been tried to measure this fluorescence yield. It's primarily done in the lab: https://arxiv.org/abs/1210.1319 For something completely different: my own master's thesis from a long time ago was an attempt to determine this yield indirectly, because we know from detailed bottom-up simulations that the air shower shape is nearly universal given the primary particle and its energy. If we knew the event geometry (direction) well enough, we could use the ratio of Cherenkov to fluorescence light along the recorded track and fit our longitudinal shower size model using the fluorescence yield as the free parameter. In hindsight that was a lot of fun. I would enjoy that type of work a great deal more today, now that I feel I have less to prove. :)
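A very loose caricature of that one-parameter fit, as described: fix the (near-universal) shower shape, scan the fluorescence yield, and keep the value that best reproduces the observed fluorescence-to-Cherenkov light ratio along the track. Everything here, the ratio model, the depths, and the numbers, is invented for illustration and is not the thesis's actual model.

```python
def predicted_ratio(depth, yield_per_mev, cherenkov_const=50.0):
    """Made-up model: fluorescence light scales with the yield, while
    Cherenkov light accumulates with depth along the track."""
    return yield_per_mev / (cherenkov_const * depth)

def best_yield(depths, observed, yield_grid):
    """Return the candidate yield minimizing the squared residuals
    between observed and predicted light ratios."""
    def cost(y):
        return sum((o - predicted_ratio(d, y)) ** 2
                   for d, o in zip(depths, observed))
    return min(yield_grid, key=cost)

depths = [1.0, 2.0, 4.0]
observed = [predicted_ratio(d, 19.0) for d in depths]  # fake "data"
grid = [y / 10 for y in range(150, 251)]               # scan 15.0..25.0
print(best_yield(depths, observed, grid))  # recovers 19.0
```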

Edit: formatting


How likely is it this Hacker News discussion is the place where a basic flaw in the experimental design is discovered?


Maybe it could happen four or so times in 30 years.


We've had many more events within an order of magnitude of these events' measured/reconstructed primary energy. The spectrum is a power law. While theoretically something completely different could be happening in the atmosphere if you add 2-10x the energy, there's really no indication of that. Auger also has high-energy events detected by multiple detection methods, not just the "standard" water-Cherenkov or scintillator ground stations.
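The power-law point can be made quantitative: if the integral spectrum scales as N(>E) ∝ E^(1-γ), then the events one decade below a given energy vastly outnumber those above it. The spectral index γ = 3 below is an illustrative round number, not a fitted value.

```python
def events_above(E, gamma=3.0, norm=1.0):
    """Integral event count above energy E for dN/dE ∝ E^-gamma."""
    return norm * E ** (1.0 - gamma)

E_top = 1.0  # arbitrary units
ratio = events_above(E_top / 10) / events_above(E_top)
print(round(ratio))  # 100x more events per decade down, for gamma = 3
```

That abundance of lower-energy events, each reconstructed with the same hardware and analysis chain, is what constrains a hypothetical detector-level flaw that would only manifest at the very highest energies.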


> It would need to be something fundamental about the design of these detectors, which has been replicated four independent times, which has completely escaped the notice of physicists for >3 decades


Your parent wasn't suggesting you hadn't considered it; they were asking for more information about the designs. To what extent is or isn't a common design mistake the probable cause?


I guarantee you we're not going to discover the problem here on HN by diving into it.


I promise you I am not remotely that ambitious.



