Lund University


Tutorial - Povray raytracing (Windows)

Basic work flow for using the resources at Lunarc targeted at a user familiar with a Windows based environment.

Povray raytracing

A simple Povray (raytracer) rendering will be conducted to illustrate how to use the Lunarc systems. The Povray renderer takes two files as input: a .pov file describing the scene to be rendered and a .ini file describing rendering settings such as image size and other global options. The output from the Povray raytracer is one or more image files.

Step 1 - Create input files

The input files needed for the rendering will be created in a new directory, povrayjob. A new directory is created either by pressing [F7] or by clicking the corresponding button in the lower part of the window.


A dialog is shown asking for a directory name.


The scene description file is created by selecting Files/Edit new file... from the main menu in WinSCP. A dialog is shown asking for a filename. Enter simple.pov and click OK. An editor window is shown in which the contents of the file can be edited. Add the following scene description to the file:


#include "colors.inc"
#include "textures.inc"

sphere {
  <0,0,0>, 5
  pigment { color rgb <1,1,0> }
  finish { phong 0.8 }
  texture { Jade }
}

camera {
  location <2,5,-10>
  look_at <0,0,0>
}

light_source {
  <0,10,-10>
  color rgb <1,1,1>
}


Click the save button in the editor toolbar and close the window. The file view should now show a single file, simple.pov.

To render the scene, an initialisation file is also needed. Create a file, simple.ini, in the same way as the file simple.pov. The file should have the following contents:


Antialias=On
Display=Off
Antialias_Threshold=0.1
Antialias_Depth=2
Input_File_Name=simple.pov
Width=2048
Height=1536


When the file is created, the file view should show the following contents:

Please note: files can also be created and edited locally and then transferred to the remote system for execution later.
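If the files are prepared locally, a tool such as PuTTY's pscp can copy them to the remote system (the host name below is a placeholder, not a real Lunarc address). A small shell sketch that checks the input files exist before transferring:

```shell
# Hypothetical transfer from the local machine (placeholder host name):
#   pscp simple.pov simple.ini xxx@docenten:povrayjob/
# Before transferring, check that the input files actually exist locally:
for f in simple.pov simple.ini; do
    if [ ! -f "$f" ]; then
        echo "missing: $f"
    fi
done
echo "check done"
```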

Step 2 - Creating a submit script

To run a job on the Lunarc systems a special script must be written, informing the batch system about the required resources, the estimated length of the job, and job notification details. Information needed by the batch system is defined using special #PBS tags in an ordinary script file.


#!/bin/sh
# Number of nodes needed
#PBS -l nodes=1
# Estimated length of job
#PBS -l walltime=1:00:00
# Request that regular output (stdout) and
# terminal output (stderr) go to the same file
#PBS -j oe

# Send mail when the job begins, ends or aborts
#PBS -m bea
#PBS -M

# Make sure that the needed software is available
. use_modules
module add povray

# Change to work directory
cd $PBS_O_WORKDIR

# Execute rendering
povray simple.ini


Save the above script as simple.scr, in the same way as described earlier. All information needed to submit the job is now available on the remote system.

Step 3 - Submitting the job

Job submission requires the user to log in to the system from a terminal. A terminal session can be started from WinSCP by selecting Commands/Open in PuTTY. From this terminal, jobs can be submitted, monitored and deleted. A typical login screen is shown below:


*****************************************
*                                       *
* Welcome to the Lunarc system Docenten *
*                                       *
*****************************************

Docenten uses modules for software handling. The following
commands can be useful:

    module list                - Shows loaded modules
    module avail               - Show available modules
    module add [modulefile]    - adds the module modulefile
    module unload [modulefile] - removes modulefile

To use modules as above in a PBS submit script, first issue
the command

    . use_modules

As software is installed on the system more modules will
become available.

[xxx@docenten xxx]$ _


The last line is called a prompt. It shows the user name, the host name and the name of the current directory, followed by a rectangular cursor. To submit the povray job we have to enter the povrayjob directory. This is accomplished using the cd command:


[xxx@docenten xxx]$ cd povrayjob
[xxx@docenten povrayjob]$ _


To show the contents of the directory the ls command can be used:


[xxx@docenten povrayjob]$ ls
simple.ini  simple.pov  simple.scr
[xxx@docenten povrayjob]$ _


If more information is needed, the ls command can be used with the -la switch.


[xxx@docenten povrayjob]$ ls -la
total 2
drwxr-xr-x   2 xxx bm 4096 May 25 15:59 .
drwx------  31 xxx bm 4096 May 25 15:16 ..
-rw-r--r--   1 xxx bm  121 May 25 12:27 simple.ini
-rw-r--r--   1 xxx bm  246 May 25 12:19 simple.pov
-rw-r--r--   1 xxx bm  423 May 25 15:16 simple.scr
[xxx@docenten povrayjob]$


Submitting the job is done using the qsub command, specifying the submit script. When the command finishes, the job number is returned. This can be used later on to query the job status.


[xxx@docenten povrayjob]$ qsub simple.scr
45995.m1
[xxx@docenten povrayjob]$ _
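The identifier printed by qsub consists of a job number and a server suffix. A shell sketch, using the value from the run above, of how the numeric part can be extracted for later status queries:

```shell
# qsub prints an identifier such as "45995.m1"; strip the server suffix
# to get the bare job number (a sketch, using the value from the run above):
jobid_full="45995.m1"
jobid="${jobid_full%%.*}"   # remove everything from the first dot onwards
echo "$jobid"               # prints 45995
```

In a script, the identifier would typically be captured directly, e.g. jobid_full=$(qsub simple.scr).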


The status of the job can be shown in several ways. The entire queue is displayed using the showq command. The output from this command is divided into three parts: active jobs, idle jobs, and blocked jobs.


[xxxx@docenten povrayjob]$ showq
ACTIVE JOBS--------------------
JOBNAME            USERNAME      STATE  PROC   REMAINING            STARTTIME

45689                xxxxxx    Running     1    11:35:40  Tue May 24 09:46:28
45896                xxxxxx    Running     1    13:51:07  Wed May 25 10:01:55
45590                xxxxxx    Running     1    15:40:27  Mon May 23 09:51:15
45957                xxxxxx    Running     1    17:42:40  Wed May 25 13:53:28
45958                xxxxxx    Running     1    17:42:40  Wed May 25 13:53:28
...

 112 Active Jobs     147 of  147 Processors Active (100.00%)

IDLE JOBS----------------------
JOBNAME            USERNAME      STATE  PROC     WCLIMIT            QUEUETIME

45975              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:34:26
45976              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:38:20
45977              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:38:44
45978              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:38:59
45979              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:39:09
45994              xxxxxxxx       Idle     1  2:12:00:00  Wed May 25 16:03:02
45995                  xxxx       Idle     1    00:05:00  Wed May 25 16:10:47
45980              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:42:19
45988                xxxxxx       Idle     1  3:08:00:00  Wed May 25 15:52:12
...

BLOCKED JOBS----------------
JOBNAME            USERNAME      STATE  PROC     WCLIMIT            QUEUETIME

44793                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:52
44803                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:53
44804                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:53
44805                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:53
...
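The full showq listing can be long; a common pattern is to filter it for your own user name with showq | grep "$USER" (showq itself only exists on the cluster). The same grep filtering, demonstrated on a captured line of the output above:

```shell
# Filtering one captured showq line the way "showq | grep $USER" would;
# the user name "xxxx" is a placeholder taken from the listing above:
line="45995                  xxxx       Idle     1    00:05:00  Wed May 25 16:10:47"
echo "$line" | grep "xxxx"
```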


When the job has finished it will no longer show up in the output of showq or qstat. By default, the output from the job is stored in a file with the same name as the submit script plus an extension containing the job number. Listing the job directory after the job has finished shows the following contents:


[xxxx@docenten povrayjob]$ ls -la
total 1392
drwxr-xr-x   2 xxxx bm    4096 May 25 16:35 .
drwx------  31 xxxx bm    4096 May 26 08:54 ..
-rw-r--r--   1 xxxx bm     121 May 25 12:27 simple.ini
-rw-r--r--   1 xxxx bm 1391582 May 25 16:35 simple.png
-rw-r--r--   1 xxxx bm     246 May 25 12:19 simple.pov
-rw-r--r--   1 xxxx bm     423 May 25 15:16 simple.scr
-rw-------   1 xxxx bm    4871 May 25 16:35 simple.scr.o45995


The file simple.png is the output from the Povray renderer, and simple.scr.o45995 contains all status output generated by the job (standard output and standard error).
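Because the log file name follows the pattern script name + ".o" + job number, it can be reconstructed from the submit script name and the identifier that qsub printed; a sketch using the values from this tutorial:

```shell
# Reconstruct the batch log file name <script>.o<jobnumber> from the
# values used earlier in this tutorial:
script="simple.scr"
jobid_full="45995.m1"            # as printed by qsub
logfile="${script}.o${jobid_full%%.*}"
echo "$logfile"                  # prints simple.scr.o45995
```

This is handy in scripts that want to inspect the log as soon as the job disappears from the queue.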

Step 4 - Downloading results

The generated output files can be transferred back to the local system using the WinSCP program. The finished rendering is shown in the following figure.
