NEMO5 distribution and support group
Discoverability: Visible
Join Policy: Invite Only
Created: 29 Jun 2010
NEMO5 Wiki Pages
[[TableOfContents()]]
== Citing NEMO5 ==
Please see the academic license agreement (especially sections 1.7 and 2.5.6):
https://engineering.purdue.edu/gekcogrp/software-projects/nemo5/licenses/NEMO5_Software_Academic_License.pdf

We also kindly ask that you mention NEMO5 by name in the article text.

== Running NEMO5 Jobs on RCAC from nanoHUB ==

!NanoHUB users of NEMO5 can take advantage of the computing power of the Rosen Center for Advanced Computing (RCAC). Users can submit jobs from their nanoHUB workspaces to run on the RCAC clusters. The input files are located in the user's nanoHUB workspace, the job is submitted and run on RCAC, and the output files are written back to the user's nanoHUB workspace. It is not necessary to compile NEMO5 or to download the executable. Only the materials database and an input file are required.

!NanoHUB workspace: http://nanohub.org/tools/workspace

Materials database: https://nanohub.org/resources/13606

Sample input files: https://nanohub.org/resources/13410

1. The most recent revision of NEMO5 on RCAC's Conte machine is 21229. Replace 'XXXXX' in the commands below with this number.

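Since the revision number appears in several of the submit commands below, one convenient pattern is to keep it in a shell variable. This is only a sketch; `REV` is an illustrative variable name, not anything the submit command requires:

```shell
# Keep the NEMO5 revision number in one place; the commands below can
# then use nemo-r${REV} instead of a hand-edited nemo-rXXXXX.
REV=21229
echo "nemo-r${REV}"   # prints "nemo-r21229"
```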
2. Create a symbolic link to the all.mat materials database file:
{{{
ln -s /apps/share64/debian7/nemo/rXXXXX/mat/all.mat
}}}

Regardless of where you have the all.mat file located locally, the "database" line in the input file needs to be
{{{
database = ./all.mat
}}}

3. Use the submit command to request a job to run on the RCAC machines. You can find out more information about the submit command using:

{{{
submit --help
}}}

4. A typical job submission command is shown below. This particular example is located in the workspace in
/apps/share64/nemo/examples/current/public_examples/bulk_!GaAs_band_structure

{{{
submit -v conte -i ./all.mat nemo-rXXXXX GaAs_sp3d5sstar_SO_Klimeck.in
}}}
Here, the computer system conte is the venue where the job will execute. The materials database (all.mat) is passed with the '-i' flag. The line ends with the name of the NEMO5 executable and the input file.


For instance, if the materials database file is in a separate directory, the command may look something like
{{{
submit -v conte -i /apps/share64/debian7/nemo/rXXXXX/mat/all.mat nemo-rXXXXX GaAs_sp3d5sstar_SO_Klimeck.in
}}}
After the job runs, output files will be written back to the nanoHUB workspace directory from which you submitted the job.

5. Larger simulations can be run on multiple cores. See the "Partitioning" section (4.2.4) of the NEMO5 manual. The total number of cores ('-n') and wall time ('-w') can be specified. For '-N', the value should equal the number of cores per node; see the "RCAC Machine Constraints" section below.

The submit command takes arguments related to the number of processors and walltime required.

{{{
submit --help
}}}

The relevant arguments are
{{{
-n NCPUS, --nCpus=NCPUS    Number of cores for MPI execution
-N PPN, --ppn=PPN          Number of cores/node for MPI execution
-w WALLTIME, --wallTime=WALLTIME
}}}

The following job requests 16 cores on Conte for 5 minutes (the example below is located in public_examples/bulk_!GaAs_parallel).

{{{
submit -v conte -i ./all.mat -n 16 -w 0:05:00 nemo-rXXXXX bulkGaAs_parallel.in
}}}

This will request 32 cores on two nodes (16 cores on each node):
{{{
submit -v conte -i ./all.mat -n 32 -N 16 -w 0:05:00 nemo-rXXXXX bulkGaAs_parallel.in
}}}
If the request is for more than one total node, the request must be for the total number of cores on the nodes. In other words, one cannot have
{{{
-n 24 -N 16
}}}
<!--
However, one can have
{{{
-n 1 -N 1
}}}
-->
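The whole-node rule above can be checked before submitting. Below is a minimal sketch (a hypothetical helper, not part of the submit command) that flags invalid '-n'/'-N' combinations: a request is valid when it fits on a single node, or when it uses whole nodes.

```shell
# A request is valid when it fits on one node (n <= N) or uses whole
# nodes (n is an exact multiple of N, the cores per node).
check_request() {
  n=$1; N=$2
  if [ "$n" -le "$N" ] || [ $((n % N)) -eq 0 ]; then
    echo "ok: -n $n -N $N"
  else
    echo "invalid: -n $n -N $N (must use whole nodes)"
  fi
}
check_request 32 16   # prints "ok: -n 32 -N 16"
check_request 24 16   # prints "invalid: -n 24 -N 16 (must use whole nodes)"
```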

== RCAC Machine Constraints ==
Job submissions to conte go to the standby queue, which has a maximum walltime of 4 hours.

There are currently two venues available.

Conte has 16 cores per node.

{{{
"conte": standby queue, 4 hours, maximum_cores=TBD
"ncn-hub@conte": 336 hours, maximum_cores=TBD
}}}

You can specify the queue ("standby@conte") or leave it blank ("conte") and the submit command will try to find a venue for the job.
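Since the standby queue caps walltime at 4 hours, a quick pre-flight check of an h:mm:ss request can save a rejected submission. A sketch, assuming only the 4-hour limit stated above; the parsing is plain shell and awk, not a feature of submit:

```shell
# Convert an h:mm:ss walltime request to seconds and compare it
# against the 4-hour standby cap on conte.
wt="0:05:00"
total=$(echo "$wt" | awk -F: '{ print $1*3600 + $2*60 + $3 }')
limit=$(( 4 * 3600 ))
if [ "$total" -le "$limit" ]; then
  echo "walltime ok for standby"
else
  echo "exceeds the 4-hour standby limit"
fi
```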
== Support Materials ==
These and other materials can be accessed through the 'Resources' tab to the left:

NEMO5 2014 Summer Hands-on session materials
https://nanohub.org/resources/21824

Regression Tests
https://nanohub.org/resources/19171

2012 Summer School (Some of these tutorials are outdated; they can be used as a general guide to NEMO5, but many input options and features have changed.)
http://www.nanohub.org/resources/14775

== Questions ==

Please post all questions to the discussion forum:
https://nanohub.org/groups/nemo5distribution/forum

Please provide the input deck, the submit command or other execution command, and output files if available.

You will automatically receive emails of forum posts. To unsubscribe, uncheck the box at the bottom of the forum page:
https://nanohub.org/groups/nemo5distribution/forum

=== Common Questions ===

==== Python ====
If the simulation stops quickly and stderr shows "[PythonCache] File not found", your simulation requires one or more Python sources which need to be included as part of your submit statement. You can see the required file in stdout; it will look like this: "[PythonCache]: trying to get file: './MetaPoissonQTBM5.py'"

Some Python sources are available here. If you find that you need a different one, please post on the forum.

https://nanohub.org/resources/21835

Put them in the same directory as your input file and pass them to the submit command as shown below.

{{{
submit -v conte -i ./all.mat -i metasolver.py -i another_meta_solver_if_needed.py nemo-r19861 InAs_pin_nanowire_TFET.in
}}}

== Input Files ==
See the regression tests for input files:
https://nanohub.org/resources/19171

== Input Deck and Materials Database Editor ==
For use with the Firefox browser [[BR]]
https://engineering.purdue.edu/gekcogrp/software-projects/nemo5/InputDeckEditor/

== Building NEMO5 ==

* If you want to compile from source, it is available: https://nanohub.org/resources/13244

Please see this document for help with building NEMO5:

https://docs.google.com/document/d/17chrMIPbwORMSeQSeefcUaIO0ZMZ4VOnkgMExAq4xKQ/edit?usp=sharing

The current manual is here: https://nanohub.org/resources/13112

Again, it is '''strongly suggested''' that users run NEMO5 through the nanoHUB workspace.

== Developers ==

'''Current'''
* '''Michael Povolotskyi''': (atomistic) geometry generation, FEM (libmesh), Poisson solver (libmesh), quadrature rules (libmesh), degree-of-freedom map, input deck parser, general simulation class and simulation flow, Makefile system [[BR]]
* '''Tillmann Kubis''': tight-binding Hamiltonian and passivation, Propagation class [[BR]]
* '''Jim Fonseca''': memory management, strain, support, testing
* '''Jean Michel Sellier''': structure design screen of the 1D Heterostructure Tool
* '''Zhengping Jiang''': effective-mass Hamiltonian, resonance finder, RTD simulations

'''Past'''
* '''Hong-Hyun Park''': transport simulations (NEGF with recursive algorithms, wavefunction formalism) [[BR]]
* '''Sebastian Steiger''': material database, matrix classes, PETSc matrices and linear solvers, SLEPc eigensolvers, messaging system, K-space, QHULL tetrahedralizer and Voronoi cell generator, VFF strain solver, VFF phonon solver, most crystal classes, test suite
* '''Lang Zeng''': random dopants
* '''Denis Areshkin''': 2nd derivative of some VFF energy terms, 2D irreducible wedge finder
* '''Arun Goud''': nanoHUB interfaces to the 1D Heterostructure Tool, Quantum Dot Lab, Brillouin Zone Viewer, and Resonant Tunneling Diode Simulation using NEGF
* '''Lars Bjaalie''': nanoHUB interface to Quantum Dot Lab
* '''Benjamin Haley''': material database (initial version)