In May, part of the apertus° team gave an AXIOM presentation at the Louis Lumière cinema school. This is their report:
It’s a unique school where they teach what you might call “image architecture” (a mix of cinematography and engineering). We gave a two-part presentation on open source and its benefits for filmmakers and on the AXIOM camera roadmap, followed by a hands-on workshop with the Alpha prototype.
We received great feedback from the attendees. It’s really interesting to notice that the open source philosophy is increasingly known and appreciated by the people we talk to. We hear fewer and fewer people saying "you are crazy", "this will never work", or "why would I need this?". There's more enthusiasm these days and increased awareness of the usefulness of being open. We even get support from the students (one of them is already thinking of taking AXIOM as a thesis subject) and their teachers.
We also had the joy of visiting the school’s colorimetry lab, probably one of the most sophisticated in Europe. They have all the colorimetry tools a camera maker could dream of. Here is an explanation (as far as I understood it) of the most interesting tool, used for camera characterisation:
Here you can see the device that generates accurate color patches for testing a camera. It starts with a light source with a very wide spectrum (an arc light for the visible spectrum, plus a UV lamp and an IR lamp for the parts of the spectrum we humans cannot see). This light passes through a holographic beamsplitter, and the resulting light is projected onto a DLP array and then sent via optical fiber into an integrating (reflection) sphere. The device generates a controlled beam of light with accurate wavelengths (in 1 nm steps), which is sent to both the camera and a spectrometer for real-time comparison. The camera's recorded signal can then be analyzed to evaluate how far the camera's output is from "reality" for each color patch. By measuring a few hundred colors, it's possible to build a very accurate profile.
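To give an idea of what building such a profile can look like in practice, here is a minimal, hypothetical sketch: given camera RGB responses and spectrometer reference values for a handful of patches (the numbers below are made up), a least-squares fit produces a correction matrix, and the per-patch residual measures how far the camera is from "reality". A real characterisation would use hundreds of patches and a more sophisticated model.

```python
# Hypothetical sketch: fitting a 3x3 color-correction matrix from
# measured patches (camera RGB vs. spectrometer reference values).
# All numbers are invented for illustration.
import numpy as np

# Each row: the camera's raw RGB response for one generated patch.
camera_rgb = np.array([
    [0.90, 0.10, 0.05],
    [0.20, 0.80, 0.10],
    [0.05, 0.15, 0.85],
    [0.50, 0.50, 0.10],
])

# Corresponding reference tristimulus values from the spectrometer.
reference_xyz = np.array([
    [0.95, 0.20, 0.02],
    [0.30, 0.90, 0.12],
    [0.10, 0.20, 0.90],
    [0.60, 0.60, 0.08],
])

# Least-squares fit: find M such that camera_rgb @ M ~= reference_xyz.
M, residuals, rank, _ = np.linalg.lstsq(camera_rgb, reference_xyz, rcond=None)

# Per-patch error: how far the corrected camera output is from the
# reference. With a few hundred patches this builds an accurate profile.
errors = np.linalg.norm(camera_rgb @ M - reference_xyz, axis=1)
print("Correction matrix:\n", M)
print("Per-patch error:", errors)
```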
This lab has already tested most cameras currently on the market, and they are very interested in helping us calibrate the AXIOM in a scientific way! After all, this is the first camera that lets you compare the light you send in with the light the sensor actually sees - it's not just a black box! AXIOM exposes all the internal controls and parameters of the image pipeline to the user in a completely transparent way. For the first time, this lab can study how changing different parameters (from the image sensor through all the other hardware and software) affects each step of the internal processing and ultimately the digital signal at the output.
Tons of questions remain, and having an open source camera might at some point bring answers.
- Is every image sensor unique in terms of color rendition? (They noticed that cameras of the same brand, even from the same batch of a well-known manufacturer, do not all have the same color rendition.)
- How are those cameras calibrated?
- Do manufacturers even have the proper tools at hand to fully calibrate each unit they ship, or do they just calibrate a handful of primaries?
- How are proprietary cameras calibrated?
- How can we enhance our calibration process?
- Can we tweak the sensor settings to find the best color rendition?
- Can we automate this to test a lot of different parameters?
Having a completely open camera with scriptable parameters will probably open up new testing opportunities.
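As a rough illustration of what "scriptable parameters" could enable, here is a hypothetical sketch of an automated parameter sweep. The functions `set_sensor_parameter()` and `measure_patch_error()` are invented stand-ins for whatever interface the camera and the lab's measurement setup eventually expose; the point is only the shape of the loop: try each combination of settings, measure the color error, and keep the best.

```python
# Hypothetical sketch: automatically sweeping sensor parameters on a
# scriptable camera to minimize measured color error. The two helper
# functions below are stand-ins, not a real camera API.
import itertools

def set_sensor_parameter(camera_state, name, value):
    # Stand-in: a real implementation would talk to the camera.
    camera_state[name] = value

def measure_patch_error(camera_state):
    # Stand-in: pretend gain 1.0 and 10 ms exposure give the best
    # color rendition, so the error is the distance from that point.
    return abs(camera_state["gain"] - 1.0) + abs(camera_state["exposure_ms"] - 10)

best = None
camera_state = {}
# Try every combination of a few gain and exposure settings.
for gain, exposure in itertools.product([1.0, 2.0, 4.0], [5, 10, 20]):
    set_sensor_parameter(camera_state, "gain", gain)
    set_sensor_parameter(camera_state, "exposure_ms", exposure)
    err = measure_patch_error(camera_state)
    if best is None or err < best[0]:
        best = (err, gain, exposure)

print("Best (error, gain, exposure_ms):", best)
```

With a real camera behind those two functions, the same loop could run unattended through thousands of parameter combinations in the lab's measurement rig.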
We were all very excited imagining the potential of all this. Exciting times ahead!