= NEMO5 Wiki Pages =

== Getting NEMO5 ==

 * The preferred way to use NEMO5 is to submit jobs to the RCAC computing clusters from your nanoHUB workspace. See "Running NEMO5 Jobs on RCAC from nanoHUB" below.
 * The statically compiled NEMO5 executable for use on a local 64-bit x86 Linux machine can be found here: https://nanohub.org/resources/13117
 * If you want to compile from source, it is available here: https://nanohub.org/resources/13244

== Using NEMO5 ==

 * The current manual is here: https://nanohub.org/resources/13112
 * Please check the Discussion section and post questions there.

== Running NEMO5 Jobs on RCAC from nanoHUB ==

NanoHUB users of NEMO5 can take advantage of the computing power of the Rosen Center for Advanced Computing (RCAC). Users can submit jobs from their nanoHUB workspaces to run on the RCAC clusters. The input files are located in the user's nanoHUB workspace, the job is submitted and run on RCAC, and the output files are written back to the user's nanoHUB workspace. It is not necessary to compile NEMO5 or to download the executable; only the materials database and an input file are required.

Materials database: https://nanohub.org/resources/13606 [[BR]]
Sample input files: https://nanohub.org/resources/13410

1. The most recent supported revision of NEMO5 is r7031. Replace 'XXXX' below with this number.

2. Use the submit command to request a job to run on the RCAC machines. You can find out more about the submit command using:
{{{
submit --help
}}}
A typical job submission command is shown below. This particular example is included in the public_examples resource in public_examples/bulk_GaAs_band_structure:
{{{
submit -v coates -i ./all.mat -i GaAs_sp3d5sstar_SO_Klimeck.in nemo-rXXXX GaAs_sp3d5sstar_SO_Klimeck.in
}}}
Here, the computer system coates is the venue where the job will execute; coates is the only venue available right now. The two required files, the input file (GaAs_sp3d5sstar_SO_Klimeck.in) and the materials database (all.mat), are passed with the '-i' flag. The line ends with the name of the nemo executable and the input file. Make sure you use the most up-to-date materials database file; all.mat can be found in Resources: https://nanohub.org/resources/13606

Regardless of where the all.mat file is located locally, the "database" line in the input file needs to be:
{{{
database = ./all.mat
}}}
For instance, if the materials database file is in a separate directory, the command may look something like:
{{{
submit -v coates -i ../../materials/all.mat -i GaAs_sp3d5sstar_SO_Klimeck.in nemo-rXXXX GaAs_sp3d5sstar_SO_Klimeck.in
}}}
After the job runs, the output files will be written back to the nanoHUB workspace directory from which you submitted the job. A sketch of a complete run set up this way is shown below.
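To make the workflow concrete, here is a minimal shell sketch of one way to stage and submit the example above from a workspace terminal. The directory names and file locations are hypothetical; only the submit line and the resource contents come from this page, with 'XXXX' replaced by the supported revision r7031.
{{{
# Hypothetical workspace layout; the directory names are illustrative only.
mkdir -p ~/runs/bulk_GaAs
cd ~/runs/bulk_GaAs

# Copy the materials database and the sample input file into the run directory
# (assumes both were downloaded from the Resources pages linked above).
cp ~/downloads/all.mat .
cp ~/downloads/GaAs_sp3d5sstar_SO_Klimeck.in .

# The input file must point at the database as: database = ./all.mat

# Submit to the coates venue; output files are written back to this directory.
submit -v coates -i ./all.mat -i GaAs_sp3d5sstar_SO_Klimeck.in nemo-r7031 GaAs_sp3d5sstar_SO_Klimeck.in
}}}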
3. (Under construction) Larger simulations can be run on multiple cores. The total number of cores ('-n') and the wall time ('-w') can be specified; for '-N', note that Coates has 8 cores per node. See the "Partitioning" section (4.2.4) of the NEMO5 manual. The submit command takes arguments related to the number of processors and the walltime required:
{{{
submit --help
}}}
The relevant arguments are:
{{{
 -n NCPUS, --nCpus=NCPUS          Number of processors for MPI execution
 -N PPN, --ppn=PPN                Number of processors/node for MPI execution
 -w WALLTIME, --wallTime=WALLTIME
}}}
The following job requests 8 cores on Coates for 5 minutes (this example is located in public_examples/bulk_GaAs_parallel):
{{{
submit -v coates -i ../../materials/all.mat -i bulkGaAs_parallel.in -n 8 -w 0:05:00 nemo-rXXXX bulkGaAs_parallel.in
}}}

== Developers ==

'''Current'''
 * '''Michael Povolotskyi''': (atomistic) geometry generation, FEM (libmesh), Poisson solver (libmesh), quadrature rules (libmesh), degree-of-freedom map, input deck parser, general simulation class and simulation flow, Makefile system
 * '''Tillmann Kubis''': tight-binding Hamiltonian and passivation, Propagation class
 * '''Jim Fonseca''': memory management, strain, support
 * '''Jean Michel Sellier''': structure design screen of the 1D Heterostructure Tool
 * '''Zhengping Jiang''': effective mass Hamiltonian, resonance finder, RTD simulations

'''Past'''
 * '''Hong-Hyun Park''': transport simulations (NEGF with recursive algorithms, wavefunction formalism)
 * '''Sebastian Steiger''': material database, matrix classes, PETSc matrices and linear solvers, SLEPc eigensolvers, messaging system, K-space, QHULL tetrahedralizer and Voronoi cell generator, VFF strain solver, VFF phonon solver, most crystal classes, test suite
 * '''Lang Zeng''': random dopants
 * '''Denis Areshkin''': 2nd derivative of some VFF energy terms, 2D irreducible wedge finder
 * '''Arun Goud''': nanoHUB interfaces to the 1D Heterostructure Tool, Quantum Dot Lab, Brillouin Zone Viewer, and Resonant Tunneling Diode Simulation using NEGF
 * '''Lars Bjaalie''': nanoHUB interface to Quantum Dot Lab
 * '''Benjamin Haley''': material database (initial version)