== Running NEMO5 ==
For most users, the best way to run NEMO5 simulations is to submit jobs to the RCAC computing clusters from a nanoHUB workspace. See "Running NEMO5 Jobs on RCAC from nanoHUB" below. Student Cluster Competitors should download the source.

== Building NEMO5 ==
Please see the README in the source repository: https://nanohub.org/resources/13244 [[BR]]
The current manual is here: https://nanohub.org/resources/13112

== Support ==
Please post all questions to the discussion forum: https://nanohub.org/groups/nemo5distribution/forum [[BR]]
Unfortunately, the developers cannot respond to direct requests for support. We try to keep a blog with updates here: https://nanohub.org/groups/nemo5distribution/blog/2012/03/hello

== Input Files ==
See the regression tests for example input files: https://nanohub.org/resources/19171

== Running NEMO5 Jobs on RCAC from nanoHUB ==
!NanoHUB users of NEMO5 can take advantage of the computing power of the Rosen Center for Advanced Computing (RCAC). Users can submit jobs from their nanoHUB workspaces to run on the RCAC clusters: the input files are located in the user's nanoHUB workspace, the job is submitted and run on RCAC, and the output files are written back to the user's nanoHUB workspace. It is not necessary to compile NEMO5 or to download the executable. Only the materials database and an input file are required.

!NanoHUB workspace: http://nanohub.org/tools/workspace [[BR]]
Materials database: https://nanohub.org/resources/13606 [[BR]]
Sample input files: https://nanohub.org/resources/13410

1. The most recent revision of NEMO5 on RCAC for use from a nanoHUB workspace is 11160 (previously 10328; a new version is coming Sept. 23rd). Replace 'XXXXX' in the commands below with this number.

2. Create a symbolic link to the all.mat materials database file:
{{{
ln -s /apps/share64/nemo/examples/current/materials/all.mat
}}}
Regardless of where the all.mat file is located locally, the "database" line in the input file needs to be
{{{
database = ./all.mat
}}}

3. Use the submit command to request a job to run on the RCAC machines. You can find out more about the submit command with:
{{{
submit --help
}}}

4. A typical job submission command is shown below. This particular example is located in the workspace in /apps/share64/nemo/examples/current/public_examples/bulk_!GaAs_band_structure
{{{
submit -v coates -i ./all.mat nemo-rXXXXX GaAs_sp3d5sstar_SO_Klimeck.in
}}}
Here, the computer system coates is the venue where the job will execute. The materials database (all.mat) is passed with the '-i' flag. The line ends with the name of the NEMO5 executable and the input file. The venues available right now are Coates and Rossmann; Rossmann is a newer system and will most likely execute faster. If the materials database file is in a separate directory, the command may look something like
{{{
submit -v coates -i ../materials/all.mat nemo-rXXXXX GaAs_sp3d5sstar_SO_Klimeck.in
}}}
After the job runs, the output files will be written back to the nanoHUB workspace directory from which you submitted the job.

5. (Under construction) Larger simulations can be run on multiple cores. The total number of cores ('-n') and the wall time ('-w') can be specified; for '-N' (cores per node), note that Coates has 8 cores per node. See the "Partitioning" section (4.2.4) of the NEMO5 manual. The submit command takes arguments for the number of cores and the wall time required; see {{{submit --help}}}. The relevant arguments are
{{{
 -n NCPUS, --nCpus=NCPUS    Number of cores for MPI execution
 -N PPN, --ppn=PPN          Number of cores/node for MPI execution
 -w WALLTIME, --wallTime=WALLTIME
}}}
The following job requests 8 cores on Coates for 5 minutes (the example below is located in public_examples/bulk_!GaAs_parallel):
{{{
submit -v coates -i ./all.mat -n 8 -w 0:05:00 nemo-rXXXXX bulkGaAs_parallel.in
}}}
This requests 16 cores on two nodes:
{{{
submit -v coates -i ./all.mat -n 16 -N 8 -w 0:05:00 nemo-rXXXXX bulkGaAs_parallel.in
}}}
If the request is for more than one node, it must be for the total number of cores on those nodes, i.e. only whole nodes can be requested. In other words, one cannot have
{{{
-n 12 -N 8
}}}
However, one can have
{{{
-n 1 -N 1
}}}
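For convenience, steps 2 through 5 can be combined into a single shell session in the workspace. The sketch below is illustrative only: it assumes revision 11160, the coates venue, and that the bulk_!GaAs_parallel input deck lives under the /apps/share64/nemo/examples/current/public_examples directory mentioned above; the working directory name is arbitrary. Adjust the paths, revision number, and resource requests for your own job.
{{{
# Hypothetical working directory; any directory in the workspace will do.
mkdir -p ~/nemo_runs/bulkGaAs_parallel
cd ~/nemo_runs/bulkGaAs_parallel

# Step 2: link the shared materials database so that "database = ./all.mat" resolves.
ln -s /apps/share64/nemo/examples/current/materials/all.mat

# Copy the example input deck (path assumed from the examples layout above).
cp /apps/share64/nemo/examples/current/public_examples/bulk_GaAs_parallel/bulkGaAs_parallel.in .

# Step 5: request 8 cores on Coates for 5 minutes; output is written back to this directory.
submit -v coates -i ./all.mat -n 8 -w 0:05:00 nemo-r11160 bulkGaAs_parallel.in
}}}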
== RCAC Machine Constraints ==
Job submissions to Coates go to the standby queue, which has a maximum wall time of 4 hours.

== Input Deck and Materials Database Editor ==
For use with the Firefox browser: [[BR]]
https://engineering.purdue.edu/gekcogrp/software-projects/nemo5/InputDeckEditor/

== Static Binary and Source Code ==
* The NEMO5 statically compiled executable for use on a local x86 64-bit Linux machine can be found here: https://nanohub.org/resources/13117
* If you want to compile from source, it is available here: https://nanohub.org/resources/13244

== Developers ==
'''Current'''
* '''Michael Povolotskyi''': (atomistic) geometry generation, FEM (libmesh), Poisson solver (libmesh), quadrature rules (libmesh), degree-of-freedom map, input deck parser, general simulation class and simulation flow, Makefile system
* '''Tillmann Kubis''': tight-binding Hamiltonian and passivation, Propagation class
* '''Jim Fonseca''': memory management, strain, support
* '''Jean Michel Sellier''': structure design screen of the 1D Heterostructure Tool
* '''Zhengping Jiang''': effective mass Hamiltonian, resonance finder, RTD simulations

'''Past'''
* '''Hong-Hyun Park''': transport simulations (NEGF with recursive algorithms, wavefunction formalism)
* '''Sebastian Steiger''': material database, matrix classes, PETSc matrices and linear solvers, SLEPc eigensolvers, messaging system, K-space, QHULL tetrahedralizer and Voronoi cell generator, VFF strain solver, VFF phonon solver, most crystal classes, test suite
* '''Lang Zeng''': random dopants
* '''Denis Areshkin''': 2nd derivative of some VFF energy terms, 2D irreducible wedge finder
* '''Arun Goud''': nanoHUB interfaces to the 1D Heterostructure Tool, Quantum Dot Lab, Brillouin Zone Viewer, and Resonant Tunneling Diode Simulation using NEGF
* '''Lars Bjaalie''': nanoHUB interface to Quantum Dot Lab
* '''Benjamin Haley''': material database (initial version)