Linux workflow with blender - first steps...

ElphelVision, DNG Conversion, etc., anything related to Software goes in here

Re: Linux workflow with blender - first steps...

Postby carlos padial » Sun Aug 26, 2012 10:15 am

I have been testing some EXR conversions from Elphel footage. It's a great option inside Blender, though a little heavier than DNG (~15 MB/frame).
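
For reference, here is a quick sanity check of where a figure like ~15 MB/frame can come from. This is just a minimal sketch: the 2592x1936 frame size and half-float RGB layout are my assumptions, and real EXR compression ratios depend on the image content.

Code:

# rough size estimate for a single EXR frame (all numbers are assumptions)
width, height = 2592, 1936       # assumed ~5 MP Elphel frame
channels = 3                     # RGB, no alpha
bytes_per_sample = 2             # half-float (16-bit) EXR

uncompressed = width * height * channels * bytes_per_sample
compressed = uncompressed / 2.0  # lossless ZIP/PIZ often roughly halves it

print("uncompressed: %.1f MB" % (uncompressed / 1e6))  # ~30.1 MB
print("compressed:   %.1f MB" % (compressed / 1e6))    # ~15.1 MB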

For now it only works properly in the compositor, where you can work in linear 32-bit, but fixes are being worked on to get it usable in the VSE (currently the VSE applies an sRGB 2.2 gamma correction to all strips... so it's not great for working with linear EXR...).

You can read about the new development here:
http://wiki.blender.org/index.php/User: ... rModifiers

Re: Linux workflow with blender - first steps...

Postby Sasha Cohen » Sun Aug 26, 2012 3:06 pm

I've been curious for some time now about how OpenEXR could work as an acquisition format. I'm sure VFX artists would love a camera that can shoot to it too.

Thanks for this great information Carlos!
:D

Re: Linux workflow with blender - first steps...

Postby carlos padial » Wed Oct 03, 2012 1:51 pm

Here is the script I am using to convert JP4 into EXR files.

You need to install qtpfsgui, exiftool and elphel_dng to get it working (plus GNU parallel if you use the usage line in the script).

Code:

#!/usr/bin/python

'''
usage:
ls *.jpg | parallel -j 8 python jp4-elphel-exr.py {}
'''

import os, sys, subprocess

for infile in sys.argv[1:]:
    f, e = os.path.splitext(infile)

    # convert the JP4 frame into a DNG
    command = "elphel_dng 100 "+f+e+" "+f+".dng"
    subprocess.call(command, shell=True)

    # copy the EXIF tags from the original frame into the DNG
    # (this leaves a .dng_original backup, removed below)
    command = "exiftool -tagsFromFile "+f+e+" "+f+".dng"
    subprocess.call(command, shell=True)

    # write fixed exposure metadata so every frame gets identical values
    command = "exiftool -ISO=100 -FocalLength=4.5 -ExposureTime=0.04 -ApertureValue=2.0 -overwrite_original "+f+".dng"
    subprocess.call(command, shell=True)

    # dcraw -T -4 -q 1 -a -m 2
    # |
    # `-> you have to configure this inside qtpfsgui preferences

    # qtpfsgui (using the dcraw settings above) writes the EXR
    command = "qtpfsgui "+f+".dng -s "+f+".exr"
    subprocess.call(command, shell=True)

    # clean up the exiftool backup and the intermediate DNG
    command = "rm "+f+".dng_original"
    subprocess.call(command, shell=True)

    command = "rm "+f+".dng"
    subprocess.call(command, shell=True)



And here is a brief description of how to grade EXR with Blender and the "Edit with compositor" script.



You can find info about the Blender script here:
http://blenderartists.org/forum/showthr ... compositor
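
To give an idea of what such a grade looks like in script form, here is a minimal bpy sketch (my own illustration, not the "Edit with compositor" script itself; the file path and the color balance values are placeholders). It loads one EXR frame and wires it through a Color Balance node in the compositor, where everything stays linear:

Code:

# minimal compositor grading sketch -- run from Blender's text editor
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# load a single EXR frame (placeholder path)
img_node = tree.nodes.new(type='CompositorNodeImage')
img_node.image = bpy.data.images.load("/path/to/frame_000001.exr")

# simple lift/gamma/gain grade (placeholder values)
grade = tree.nodes.new(type='CompositorNodeColorBalance')
grade.gain = (1.05, 1.00, 0.95)   # warm the highlights a bit
grade.lift = (1.00, 1.00, 1.02)   # cool the shadows a bit

out = tree.nodes.new(type='CompositorNodeComposite')

tree.links.new(img_node.outputs['Image'], grade.inputs['Image'])
tree.links.new(grade.outputs['Image'], out.inputs['Image'])

As far as I understand, the "Edit strip with Comp" script automates setting up something like this per strip, so you don't have to build the node tree by hand.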

Re: Linux workflow with blender - first steps...

Postby flavio » Wed Oct 03, 2012 3:55 pm

Carlos, you have taken RAW video editing in Linux to ANOTHER level. I can't applaud your work enough! Congratulations! We'll shoot at the end of this month, and I am reconsidering my workflow after seeing this.

From what I understand of your script:
RAW MOV files -> JPG -> JP4 -> DNG -> EXR

Is this it?

So the workflow would be something like this:

RAW MOV files -> JPG -> JP4

You open the sequences of JP4 and join them with metastrips to edit them in Blender's VSE. Once edited, you take only the frames you used to:

JP4 -> DNG -> EXR

And open the EXR as another layer, covering the base metastrip you have. Then you debayer and color grade the images in the node editor, making things easier with the scripts we see in the video.

Then you render the graded frames back to your VSE timeline for the final render. Is this it?

Question: if what I said is right, then you could also take the DNGs converted with movie2dng to become EXR, is this correct? Meaning: if I went on with a Cinelerra workflow for editing, I could still do the color grading in Blender as long as I could connect the two programs - Blender would read my Cinelerra timeline edits in its VSE (if I manage to do that, of course), and I could then make it point to the EXR files instead of the base JPGs I'm using, color grade in Blender and do the final render there.

This is my original workflow wish. It's - for me - the best of both worlds in Linux, with the benefit that if people want to do the whole editing in Blender, they can. It will also skip the UFRaw color grading that I'm currently doing, the part I'm quite unsure about because the way you're connecting the dots is way better than mine.

Now, I would like everyone in this forum to realize something: what Carlos is currently doing is what the community has been calling "Cinema DNG" and considering a distant wish. Thanks Carlos, what you have proven is that hacking is still a fucking good trick! =)

Please respond to this, because you are the headline of our next newsletter! =)

Re: Linux workflow with blender - first steps...

Postby carlos padial » Wed Oct 03, 2012 4:04 pm

flavio wrote:
Question: if what I said is right, then you could also take the DNGs converted with movie2dng to become EXR, is this correct?


Of course, you can replace the elphel_dng part with movie2dng and keep going... or simply use ffmpeg -i video.mov -qscale 1 folder/video_%06d.jpg to split the MOV into JP4s.

Anyway, I'm not using movie2dng because I am processing the rendered sequence edited with proxies and so on...
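
In case it helps, the same extraction step written in the style of the other scripts (the input filename and output folder are placeholders):

Code:

#!/usr/bin/python

# split a JP4 MOV into individual JPEG/JP4 frames with ffmpeg
import os, subprocess

src = "video.mov"      # placeholder input
outdir = "frames"      # placeholder output folder

if not os.path.isdir(outdir):
    os.makedirs(outdir)

# -qscale 1 keeps the extracted JPEG frames at maximum quality
command = "ffmpeg -i " + src + " -qscale 1 " + outdir + "/video_%06d.jpg"
subprocess.call(command, shell=True)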

Re: Linux workflow with blender - first steps...

Postby carlos padial » Wed Oct 03, 2012 4:07 pm

Another tip for testing the workflow: set the R, G and B gains to 1 1 1, and then the color comes out perfect when converting with:

dcraw -T -4 -q 1 -w -m 2

dcraw -T -4 -q 1 -a -m 2 <---- be careful with this one: white balance is set automatically, so you will get flickering problems when the subject or lighting conditions change.
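
If you want to test those flags outside qtpfsgui, here is a minimal sketch that runs dcraw directly over a folder of DNGs with the fixed camera white balance (-w). It is only a test snippet, not part of the workflow above:

Code:

#!/usr/bin/python

# quick test of the dcraw settings on a folder of DNGs, outside qtpfsgui
import glob, subprocess

for dng in sorted(glob.glob("*.dng")):
    # -T TIFF output, -4 linear 16-bit, -q 1 interpolation quality,
    # -w camera white balance (no flicker), -m 2 median filter passes
    command = "dcraw -T -4 -q 1 -w -m 2 " + dng
    subprocess.call(command, shell=True)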

Re: Linux workflow with blender - first steps...

Postby flavio » Wed Oct 03, 2012 4:11 pm

carlos padial wrote:
Question: if what I said is right, then you could also take the DNGs converted with movie2dng to become EXR, is this correct?


Of course, you can replace the elphel_dng part with movie2dng and keep going... or simply use ffmpeg -i video.mov -qscale 1 folder/video_%06d.jpg to split the MOV into JP4s.

No, no, I mean:

Movie2dng -> DNG

Then:

DNG -> EXR

I would skip the JP4 part you're using, and go straight from the DNGs to EXR. Is this possible?

Re: Linux workflow with blender - first steps...

Postby carlos padial » Wed Oct 03, 2012 4:22 pm

Keep in mind that JP4 is almost the same as JPG. Maybe you lose some quality by saving JP4s from inside Blender, but I don't think so. It's a matter of proportions :)

this is the flow:

raw JP4 MOV --(edited via gstreamer proxies)--> BLENDER VSE --> JP4 sequence
    --> jp4-elphel-exr.py --> EXR sequence
    --> BLENDER 'Edit strip with Comp' --> OPEN SOURCE MOVIE !!!
:)

Re: Linux workflow with blender - first steps...

Postby carlos padial » Wed Oct 03, 2012 4:25 pm

Try this one inside the DNG folder.

Code:

#!/usr/bin/python

'''
usage:
ls *.dng | parallel -j 8 python dng-elphel-exr.py {}
'''

import os, sys, subprocess

for infile in sys.argv[1:]:
    f, e = os.path.splitext(infile)

    # write fixed exposure metadata so every frame gets identical values
    command = "exiftool -ISO=100 -FocalLength=4.5 -ExposureTime=0.04 -ApertureValue=2.0 -overwrite_original "+f+".dng"
    subprocess.call(command, shell=True)

    # dcraw -T -4 -q 1 -a -m 2
    # |
    # `-> you have to configure this inside qtpfsgui preferences

    # qtpfsgui (using the dcraw settings above) writes the EXR
    command = "qtpfsgui "+f+".dng -s "+f+".exr"
    subprocess.call(command, shell=True)

    # clean up (the .dng_original backup only exists if exiftool created one)
    command = "rm "+f+".dng_original"
    subprocess.call(command, shell=True)

    # remove the source DNG afterwards -- drop this line if you want to keep your DNGs
    command = "rm "+f+".dng"
    subprocess.call(command, shell=True)


Re: Linux workflow with blender - first steps...

Postby flavio » Wed Oct 03, 2012 4:58 pm

Yeah, I'll definitely test this when I get home! =)

* One question: why did you put the JP4 inside a metastrip in your project?
