
Apple M processors and PC monitor

That is why I've always used RAW files, ever since my first 16MP Pentax DSLR in 2012. I nearly lost most of them at one point; I recovered them, but a lot ended up jumbled out of order.

It may be difficult, or more bother than it's worth, to track down the .lrcat files (the Lightroom catalogs that hold the edits) again, but I could re-edit the better ones with the massively improved capability of modern Lightroom (or some other software); the noise reduction, for example, is much more advanced now.
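
If it ever comes to actually hunting the catalogs down, a plain recursive file search would probably do it. A rough sketch in Swift (the Pictures folder is just a guess at where they might live):

[CODE=swift]
import Foundation

// Walk a folder tree looking for Lightroom catalogs (.lrcat files).
// searchRoot is a placeholder; point it at wherever the old backups sit.
let searchRoot = URL(fileURLWithPath: NSHomeDirectory()).appendingPathComponent("Pictures")

if let enumerator = FileManager.default.enumerator(at: searchRoot, includingPropertiesForKeys: nil) {
    for case let fileURL as URL in enumerator where fileURL.pathExtension.lowercased() == "lrcat" {
        print("Found catalog: \(fileURL.path)")
    }
}
[/CODE]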

At one point I had to remove spots one by one from a pollen-covered sensor, and hopefully software can now fix those automatically. I've read Capture One 23 can do this on import, and I may buy it when my 'special offer yearly' Lightroom subscription ends soon.

I think my 2012 Pentax K-30 shot 12-bit RAW, but that's still absolutely fine compared to an archive of 8-bit JPEGs: 12 bits gives 4,096 tonal levels per channel versus 256, so there's far more latitude for pulling up shadows and recovering highlights.

Now I'm shooting RAW + JPEG, since the SOOC images from the S5II are so nice, but that's exactly why RAW was always a must for me: for the archive, for future monitors/screens/viewing devices, and for the post-processing latitude.

Yeah, Apple silicon SoCs are great; dedicated hardware decoders for various video codecs help explain the performance since the M1. The later chips may also have ProRes decoders and more, but I can't remember exactly; you can look up the specs. x86 chips have been adding hardware codec capability too, but they aren't RISC and they're behind on the lithography process. I think the M3 is now on a 3nm process, maybe 4nm (off the top of my head), and the M1 was 5nm. All this technology originated in the iPhone, leaving Intel and x86 behind, while RISC itself goes back to the early 1980s.
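
On the Mac side you can actually ask VideoToolbox which codecs get hardware decoding on a given machine; VTIsHardwareDecodeSupported is a real API, and the codec list below is just my pick of examples:

[CODE=swift]
import VideoToolbox
import CoreMedia

// Ask the OS whether a hardware decoder exists for each codec.
// On Apple silicon the media engine covers H.264/HEVC; ProRes engines
// only appear on the Pro/Max/Ultra chips and the M2 generation onward.
let codecs: [(String, CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC", kCMVideoCodecType_HEVC),
    ("ProRes 422", kCMVideoCodecType_AppleProRes422),
]

for (name, codec) in codecs {
    let supported = VTIsHardwareDecodeSupported(codec)
    print("\(name): hardware decode \(supported ? "yes" : "no")")
}
[/CODE]

On an M-series Mac the first two should come back as yes; ProRes depends on whether the chip actually has the ProRes engine.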

Nerd rant over.
 