For the default simple quantum dot, the bulk band gap is given as 1.43 eV (GaAs), while Eg for the quantum dot itself is reported as 0.18 eV, which is also the energy of the absorption peak. Why such a large difference, and why is the quantum dot's Eg (band gap?) smaller than the bulk value? Quantum confinement should push the dot's gap above the bulk band gap, not below it.
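To make the puzzle concrete, here is a back-of-the-envelope estimate (my own sketch, not the tool's model) using an infinite-barrier particle-in-a-box with textbook GaAs effective masses and an assumed 5 nm cubic dot. It predicts a dot gap well above 1.43 eV, which is why a reported Eg of 0.18 eV looks wrong:

```python
import math

# Infinite-barrier particle-in-a-box estimate of confinement shifts
# in a cubic GaAs quantum dot. Constants and effective masses are
# textbook values; the 5 nm dot size is an assumption for illustration.
HBAR = 1.054571817e-34   # J*s
M0   = 9.1093837015e-31  # kg, free-electron mass
EV   = 1.602176634e-19   # J per eV

def confinement_energy_eV(m_eff_ratio, side_nm):
    """Ground-state energy of a cube of side `side_nm` (three 1D terms)."""
    L = side_nm * 1e-9
    return 3 * HBAR**2 * math.pi**2 / (2 * m_eff_ratio * M0 * L**2) / EV

EG_BULK = 1.43            # eV, GaAs bulk band gap
M_E, M_HH = 0.067, 0.45   # GaAs electron / heavy-hole effective-mass ratios
SIDE = 5.0                # nm, assumed dot edge length

# Dot gap = bulk gap + electron confinement + hole confinement
eg_dot = (EG_BULK
          + confinement_energy_eV(M_E, SIDE)
          + confinement_energy_eV(M_HH, SIDE))
print(f"Estimated dot gap: {eg_dot:.2f} eV (bulk: {EG_BULK} eV)")
```

Under these assumptions the estimate comes out around 2 eV, i.e. larger than bulk, so the 0.18 eV value looks more like a confinement or level-spacing energy than a HOMO-LUMO gap.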
If you look back through the questions, Shaikh Ahmed raised a similar issue last year (2016) with the band-gap values and the way the energy is represented. Another question from a few years back points out that it is not clear where the occupied and unoccupied levels are; the energy difference between those levels (HOMO-LUMO) should correspond to the Eg value and the absorption energy. Other questions ask why the current simulation values do not match the worked example in the supporting materials, and why the simulated results do not match hand calculations.
Is this part of the tool broken?