
Tutorial - Povray raytracing (Unix)

This document describes the basic workflow for using the resources at Lunarc.

Povray raytracing

A simple Povray (raytracer) rendering will be conducted to illustrate how to use the Lunarc systems. The Povray renderer takes two files as input: a .pov file describing the scene to be rendered and a .ini file describing rendering settings such as image size and other global options. The output from the Povray raytracer is one or more image files.

Step 1 - Logging in

Logging in to the Lunarc systems is done using SSH. Open a terminal window on your local system and type the following, replacing "username" with the username given to you by the administrators at Lunarc.

 

[username@localsystem username]$ ssh username@docenten.lunarc.lu.se

 

If you haven't logged into the system before, SSH will display the following:

 

The authenticity of host docenten.lunarc.lu.se (xxx.xxx.xxx.xxx) can't be established.
RSA key fingerprint is ee:10:65:a9:0f:xx:7d:yy:02:49:xx:33:34:38:34:23.
Are you sure you want to continue connecting (yes/no)?

 

This means that SSH does not recognise the system. Answer "yes" to this question and SSH will remember the system the next time you log in. Next, a password prompt is shown. Enter your Lunarc password here.

 

xxxx@docenten.lunarc.lu.se's password:

 

If the password is correct, a welcome message and a command prompt are shown.

 

*****************************************
*                                       *
* Welcome to the Lunarc system Docenten *
*                                       *
*****************************************

Docenten uses modules for software handling. The following commands can be useful:

    module list                - Shows loaded modules
    module avail               - Show available modules
    module add [modulefile]    - adds the module modulefile
    module unload [modulefile] - removes modulefile

To use modules as above in a PBS submit script, first issue the command

    . use_modules

As software is installed on the system more modules will become available.

[xxxx@docenten xxxx]$
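For example, before running Povray interactively, the relevant module can be located and loaded using the commands listed above. A short sketch (the povray module is the same one used by the submit script later in this tutorial):

[xxxx@docenten xxxx]$ module avail
[xxxx@docenten xxxx]$ module add povray
[xxxx@docenten xxxx]$ module list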

 

Step 2 - Create input files

Create a directory "povrayjob" to hold your job files.

 

[xxxx@docenten xxxx]$ mkdir povrayjob
[xxxx@docenten xxxx]$ cd povrayjob
[xxxx@docenten povrayjob]$ _

 

Create the scene description file, simple.pov, using an editor such as vi, emacs or nano. The contents of the file are as follows:

 

#include "colors.inc" #include "textures.inc" sphere {  <0,0,0>, 5  pigment { color rgb <1,1,0> }  finish { phong 0.8 }  texture { Jade } } camera {  location <2,5,-10>  look_at <0,0,0> } light_source {  <0,10,-10>  color rgb <1,1,1> }

 

To render the file, an initialisation file is also needed. Create a file, simple.ini, in the same way as the file simple.pov. The file should have the following contents:

 

Antialias=On
Display=off
Antialias_Threshold=0.1
Antialias_Depth=2
Input_File_Name=simple.pov
Width=2048
Height=1536

 

Please note: files can also be created and edited locally and then transferred to the remote system for execution later.
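For example, a locally edited simple.pov could be uploaded with scp from the local system; a sketch assuming the povrayjob directory already exists in your home directory on Docenten:

[username@localsystem username]$ scp simple.pov username@docenten.lunarc.lu.se:povrayjob/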

Step 3 - Creating a submit script

To run a job on the Lunarc systems, a special script must be written, informing the batch system about the required resources, the estimated length of the job and job notification information. The information needed by the batch system is defined using special #PBS tags in a normal script file.

 

#!/bin/sh

# Number of nodes needed
#PBS -l nodes=1

# Estimated length of job
#PBS -l walltime=1:00:00

# Request that regular output (stdout) and
# terminal output (stderr) go to the same file
#PBS -j oe

# Send mail when job finishes
#PBS -m bea
#PBS -M john.smith@hello.org

# Make sure that the needed software is available
. use_modules
module add povray

# Change to work directory
cd $PBS_O_WORKDIR

# Execute rendering
povray simple.ini

 

Create the above script as simple.scr, in the same way as described earlier. All information needed to submit the job is now available on the remote system.

Step 4 - Submitting the job

The povrayjob directory should now contain the following files:

 

[xxx@docenten povrayjob]$ ls -la
total 2
drwxr-xr-x   2 xxx bm 4096 May 25 15:59 .
drwx------  31 xxx bm 4096 May 25 15:16 ..
-rw-r--r--   1 xxx bm  121 May 25 12:27 simple.ini
-rw-r--r--   1 xxx bm  246 May 25 12:19 simple.pov
-rw-r--r--   1 xxx bm  423 May 25 15:16 simple.scr
[xxx@docenten povrayjob]$

 

Submitting the job is done using the qsub command, specifying the submit script. When the command finishes, the job number is returned. This can be used later on to query the job status.

 

[xxx@docenten povrayjob]$ qsub simple.scr
45995.m1
[xxx@docenten povrayjob]$ _
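The returned job identifier can, for example, be passed to qstat to query just this job; a sketch using the number from the example above:

[xxx@docenten povrayjob]$ qstat 45995.m1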

 

The status of the job can be shown in several ways. The entire queue is displayed using the showq command. If there are available nodes on the system, the job will show up in the ACTIVE JOBS section of the showq output. If all nodes are in use, the submitted job will instead end up in the IDLE JOBS section, waiting in the queue for an available node to execute on. Typical output from showq is shown below:

 

[xxxx@docenten povrayjob]$ showq

ACTIVE JOBS--------------------
JOBNAME            USERNAME      STATE  PROC   REMAINING            STARTTIME

45689                xxxxxx    Running     1    11:35:40  Tue May 24 09:46:28
45896                xxxxxx    Running     1    13:51:07  Wed May 25 10:01:55
45590                xxxxxx    Running     1    15:40:27  Mon May 23 09:51:15
45957                xxxxxx    Running     1    17:42:40  Wed May 25 13:53:28
45958                xxxxxx    Running     1    17:42:40  Wed May 25 13:53:28
...

   112 Active Jobs     147 of  147 Processors Active (100.00%)

IDLE JOBS----------------------
JOBNAME            USERNAME      STATE  PROC     WCLIMIT            QUEUETIME

45975              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:34:26
45976              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:38:20
45977              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:38:44
45978              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:38:59
45979              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:39:09
45994              xxxxxxxx       Idle     1  2:12:00:00  Wed May 25 16:03:02
45995                  xxxx       Idle     1    00:05:00  Wed May 25 16:10:47
45980              xxxxxxxx       Idle     1    22:00:00  Wed May 25 15:42:19
45988                xxxxxx       Idle     1  3:08:00:00  Wed May 25 15:52:12
...

BLOCKED JOBS----------------
JOBNAME            USERNAME      STATE  PROC     WCLIMIT            QUEUETIME

44793                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:52
44803                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:53
44804                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:53
44805                xxxxxx       Idle     1  3:08:00:00  Mon May 16 16:15:53
...
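Since the full showq listing can be long, it is often convenient to filter it for your own user name, for example:

[xxxx@docenten povrayjob]$ showq | grep username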

 

When the job has finished, it will no longer show up in the output of showq or qstat. Also, by default, the output from the job will be stored in a file with the same name as the submit script, with an added extension including the job number. If we list the job directory after the job has finished, it has the following contents:

 

[xxxx@docenten povrayjob]$ ls -la
total 1392
drwxr-xr-x   2 xxxx bm    4096 May 25 16:35 .
drwx------  31 xxxx bm    4096 May 26 08:54 ..
-rw-r--r--   1 xxxx bm     121 May 25 12:27 simple.ini
-rw-r--r--   1 xxxx bm 1391582 May 25 16:35 simple.png
-rw-r--r--   1 xxxx bm     246 May 25 12:19 simple.pov
-rw-r--r--   1 xxxx bm     423 May 25 15:16 simple.scr
-rw-------   1 xxxx bm    4871 May 25 16:35 simple.scr.o45995

 

The file simple.png is the output from the Povray renderer, and simple.scr.o45995 contains all status output generated by the job (standard output and error).
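The log file can be inspected directly on the remote system, for example with less:

[xxxx@docenten povrayjob]$ less simple.scr.o45995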

Step 5 - Downloading results

The generated output files can be transferred back to the local system using SCP or SFTP. To download simple.png to the local system, enter the following command on the local system:

 

[xxxx@sigrid xxxx]$ scp xxxx@docenten.lunarc.lu.se:/home/xxxx/povrayjob/simple.png .
simple.png                                                  100% 1358KB   4.9MB/s   00:00
[xxxx@sigrid xxxx]$ _
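The same transfer can also be done interactively with SFTP; a sketch using the same host and directory as above:

[xxxx@sigrid xxxx]$ sftp xxxx@docenten.lunarc.lu.se
sftp> cd povrayjob
sftp> get simple.png
sftp> quit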

 

If the file rendered correctly, it should look something like the image below:
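The downloaded image can be opened in any image viewer on the local system; for example, assuming ImageMagick is installed there:

[xxxx@sigrid xxxx]$ display simple.png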
