Spent the last two days completely rebuilding our ATX case testing & reviews methodology. We actually began taking notes for this overhaul back in November of 2022, then started seriously working on it in early 2023 (which is why we paused ATX case reviews). The changes modernize the testing, introduce new types of data presentation, further improve accuracy and repeatability, and address a lot of limitations that date back to the old 2016 ATX testing methods. In fact, the ATX case testing methodology was the first I built with the intent of others on the team executing it way back then, so there was a lot of room to grow. Really excited for these changes. No ETA yet on the first review to drop with it, but we will have a full long-form methodology video that'll establish and define the approach for years to come.

Some quick notes on improvements:

- We are using the acoustic chamber for this! All future ATX case reviews will have acoustic testing done in the hemi-anechoic chamber. For reference, the old method dated back to when we were still running the tests out of a spare room in the old house!
- We've moved to a flow-through style video card and have studied its performance both blocked and unblocked to better understand the impact on CPU thermals. This seems to be the direction of most modern GPUs, along with thicker cards, so all of that is reflected in the new bench! Should help differentiate cases further.
- We are increasing the focus on noise-normalized testing, which was bolted onto the old methodology and has now been fully defined and fleshed out.
- Benchmark charts now have more thermal numbers we can present. We are now logging VRM thermals, additional video card thermals, and more areas of the CPU than before. This will allow us to better itemize the impact on thermals from each case layout (for example, certain top fan configurations could help or hurt VRM thermals).
- Tests will be conducted with even more test passes (and better logging), giving more precision in the results and better error-checking capabilities (rough sketch of that idea at the end of this post).
- For internal use, we're improving data export and logging capabilities. As far as external visibility goes, it basically means we'll have a lot more easily accessible ways to visualize data or pull interesting results and present them for you all.
- We also have a lot of internal calibration data for the bench to help avoid concerns of 'result drift' over time.

TONS more going on, but that's enough for now. This will be a work in progress for probably a month or two, but we're moving quickly on it and putting together a methodology piece to fully define everything. As always, the test methods are a living subject and will get updated with discoveries, ideas we have, and requests from the audience -- but this is a strong start that takes what we learned from our CPU cooler overhaul and our old case review methods, then builds on it with massive improvements we've worked on over many years of testing cases. Looking forward to sharing more, but wanted to give a brief preview! Which (relatively) new cases do you want to see tested first? Oh, we'll also do a full send-off for the old methodology when it's heading out the door!
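
For anyone curious why the extra test passes matter: more runs tighten the averaged result and make it easier to automatically flag a bad run (a fan curve hiccup, a background process spiking, a door opening mid-test). Here's a minimal sketch of that idea in Python; it is purely illustrative, not our actual tooling, and the function name and threshold are made up:

```python
# Illustrative only: average repeated test passes and flag outlier passes.
# The 1.5 C threshold and all names here are hypothetical.
from statistics import mean, stdev

def summarize_passes(readings_c, max_dev_c=1.5):
    """Average per-pass temperature results (deg C) and flag any pass
    that deviates from the mean by more than max_dev_c."""
    avg = mean(readings_c)
    spread = stdev(readings_c) if len(readings_c) > 1 else 0.0
    outliers = [(i, r) for i, r in enumerate(readings_c, start=1)
                if abs(r - avg) > max_dev_c]
    return avg, spread, outliers

# Example: five passes of a CPU thermal result for one case configuration
avg, spread, outliers = summarize_passes([62.4, 62.9, 62.1, 66.8, 62.6])
print(f"mean={avg:.1f}C  stdev={spread:.1f}C  flagged passes={outliers}")
```

More passes shrink the error bars on the averaged number, and any flagged pass can be investigated or re-run instead of quietly skewing the chart.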