• python: Used for data processing (parsing and analysis), visualization (typically Matplotlib-generated plots), and much of the data generation through simulation. When the system can be described completely in terms of linear algebra (typical of non-interacting models), the bulk of the computation time is spent in the underlying BLAS and LAPACK libraries; this allowed multicore processing of expensive matrix operations from within the python interpreter (in this case I compiled NumPy against the Intel MKL). SciPy then provided numerical integration, curve fitting, etc., and line_profiler was very useful for finding bottlenecks in heavy numerical code. A typical experimental workflow looked something like (define parameter ranges and ready them for job submission) → (push job prerequisites to cluster) → (bulk computation via a more general python library) → (heavy data analysis) → (pull data from cluster) → (light data analysis and plotting), with each step automated as a chain of dependencies and governed by its own set of executables. In an ideal world this would be automated end-to-end, so one command could take a project from simulation to publication, but some steps still required manual intervention from time to time.
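
  To make the BLAS/LAPACK point concrete, here is a minimal sketch of the pattern described above; the model and numbers are illustrative placeholders, not taken from the actual research code:

  ```python
  import numpy as np

  def chain_hamiltonian(n_sites, hopping=1.0):
      """Single-particle Hamiltonian for a 1D tight-binding chain.

      (Illustrative model.)  For non-interacting problems like this,
      everything reduces to dense linear algebra, so the eigensolver
      below dispatches straight to LAPACK, and a threaded BLAS
      (e.g. MKL) parallelizes the heavy matrix work for free.
      """
      h = np.zeros((n_sites, n_sites))
      idx = np.arange(n_sites - 1)
      h[idx, idx + 1] = -hopping  # hop right
      h[idx + 1, idx] = -hopping  # hop left (Hermitian conjugate)
      return h

  n = 100
  energies, states = np.linalg.eigh(chain_hamiltonian(n))

  # Known closed form for this chain: E_k = -2 t cos(k pi / (N + 1)),
  # a handy correctness check on the numerical spectrum.
  exact = -2.0 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
  ```

  From here, SciPy routines (integration, curve fitting) would operate on the resulting spectra in the same interpreter session.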

  • For situations where speed was critical (usually the case for many-body simulations), a compiled language (C/C++ or Cython) was used to increase performance. In particular, ITensor was invaluable for implementing and simulating tensor networks. This is also where I’ve used ScaLAPACK and raw MPI for the occasional problem that didn’t fit into the above.

  • Model prototyping was usually done in Mathematica, both to find analytic solutions for reduced problems (where the computer algebra system serves as a check on the implementation) and for lightweight numerical processing (to define a set of test cases for the final implementation). Additionally, I used it to explore various approximations of analytic forms that would be tedious to calculate by hand, though exploratory data visualization was usually easier in IPython.

  • Document generation was handled by TeX/LaTeX. For each paper/presentation, a “final” python script generated the publication-ready images (with analysis separated out if lengthy), which were referenced directly by the finished document. Occasionally Adobe Illustrator or Inkscape was used to stitch figures together or create a diagram, but the majority of figures were generated from code/text. This was especially helpful for tracking changes and managing templates. The Beamer class was used to generate slides in the same manner.

  • As for the nuts and bolts, the usual suspects are here: git (naturally) for version tracking of all of the above; PBS or SLURM queueing systems for access to computational resources, with bash scripts for submitting arrays of simulations (not to mention bash’s use as a local shell); pip and venv for keeping dependencies in line (in C/C++ they were usually simple enough to be compiled statically); ssh/scp for communication and data management; make/cmake/scons as build systems; etc. I usually develop in Emacs, but I’m not terribly attached to it.
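
  As a flavor of the submission step, here is a minimal python sketch of generating SLURM batch scripts over a parameter grid; the script name (`simulate.py`), parameter names, and resource limits are hypothetical placeholders, and the actual `sbatch` call is left as a comment:

  ```python
  import itertools

  # Hypothetical sweep: parameter names and values are placeholders.
  hoppings = [0.5, 1.0, 2.0]
  temperatures = [0.1, 0.2]

  def batch_script(hopping, temperature):
      """Render a minimal SLURM batch script for one parameter point."""
      return (
          "#!/bin/bash\n"
          "#SBATCH --time=04:00:00\n"
          "#SBATCH --ntasks=1\n"
          f"python simulate.py --hopping {hopping} --temperature {temperature}\n"
      )

  scripts = [
      batch_script(t, temp)
      for t, temp in itertools.product(hoppings, temperatures)
  ]
  # Each script would then be written to disk and handed to the queue,
  # e.g. subprocess.run(["sbatch", "job_0.sh"]) after writing scripts[0].
  ```

  Generating the scripts programmatically keeps the whole sweep reproducible and under version control alongside the simulation code.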

  • Other things I’ve had some experience with in my research career: Julia (at the time, very early in its development), R (a little rough with complex matrix exponentials), MATLAB (did a lot of early work here, but ran into licensing issues, and Octave was very slow at the time), Fortran (inherited simulation code), AWS (using spot instances to price out certain calculations), and Blender (diagrams). I’ve also used NAMD and GROMACS for molecular dynamics simulations, usually in the context of analyzing completed simulations (with VMD for visualization). I’ve done some light work in density functional theory using VASP and WIEN2k.

  • Coursework was usually done in one of python, C/C++, or Java, with my computer science senior project written in Django. I also worked as an undergraduate writing and supporting LabVIEW instrumentation.

  • Although python is what I’m most comfortable with, I’ve tried out a number of things as hobby/side projects. Embedded development (initially through Arduino, then Photon, then STM development boards): I have a hacked-together “home automation” system that uses a Raspberry Pi together with an assortment of microcontrollers talking to an MQTT broker. At various times I’ve dabbled in web development with various combinations of Node.js, Flask, Ruby on Rails, React, and Jekyll. I’ve done introductory online courses on Haskell, PyTorch/Keras, and scikit-learn, and am working towards getting those skills “production ready”.
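
  As a flavor of the MQTT side, here is a tiny sketch of the kind of topic/payload convention such a system might use; the topic layout and device names are hypothetical, and the actual broker interaction (e.g. `client.publish(topic, payload)` with paho-mqtt) is omitted:

  ```python
  import json

  # Hypothetical topic scheme for the Pi + microcontroller setup:
  # home/<room>/<device>/<quantity>, with JSON-encoded payloads.
  def make_message(room, device, quantity, value, units=None):
      """Build a (topic, payload) pair ready to hand to an MQTT client."""
      topic = f"home/{room}/{device}/{quantity}"
      payload = json.dumps({"value": value, "units": units})
      return topic, payload

  topic, payload = make_message("kitchen", "sensor-01", "temperature", 21.5, "C")
  ```

  Keeping the topic hierarchy uniform like this lets the Raspberry Pi subscribe with simple wildcards (e.g. `home/+/+/temperature`) regardless of which microcontroller published the reading.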

Work Experience

Postdoctoral Research Associate (2016 - 2017), Graduate Student Researcher (2015 - 2016)

National Institute of Standards and Technology/University of Maryland - Worked at the Center for Nanoscale Science and Technology on electron transfer processes in sensing, catalysis, and nanoelectronics, and developed computational methodology for complex systems.

Graduate Research Assistant (2011 - 2015)

Oregon State University - Investigations of dynamics of quantum systems: fermionic transport in interacting systems, numerical methods of time evolution (time-dependent density matrix renormalization group [tDMRG], matrix product states), open quantum systems, quantum to classical transitions in non-equilibrium steady states.

Teaching Assistant (2010 - 2015)

Oregon State University - Teaching assistant for a broad range of physics courses: general physics with and without calculus, upper-division programs, an online astronomy sequence, and graduate-level solid state physics.


Please check my Google Scholar page for an updated and sortable list of publications.


D. Gruss, C.C. Chien, J. Barreiro, M. Di Ventra, and M. Zwolak. An energy-resolved atomic scanning probe,
New J. Phys. 20 (2018) 115005, arXiv preprint arXiv:1610.01903

D. Gruss, A. Smolyanitsky, and M. Zwolak. Graphene deflectometry for sensing molecular processes at the nanoscale,
arXiv preprint arXiv:1804.02701

J. Elenewski, D. Gruss, and M. Zwolak. Communication: Master equations for electron transport: The limits of the Markovian limit,
J. Chem. Phys. 147, 151101 (2017), arXiv preprint arXiv:1705.00566

D. Gruss, A. Smolyanitsky, and M. Zwolak. Communication: Relaxation-limited electronic currents in extended reservoir simulations,
J. Chem. Phys. 147, 141102 (2017), arXiv preprint arXiv:1707.06650

D. Gruss, K. A. Velizhanin, and M. Zwolak. Landauer’s formula with finite-time relaxation: Kramers’ crossover in electronic transport,
Sci. Rep. 6, 24514 (2016), arXiv preprint arXiv:1604.02962

C.C. Chien, D. Gruss, M. Di Ventra, and M. Zwolak. Interaction-induced conducting-nonconducting transition of ultra-cold atoms in 1d optical lattices,
New J. Phys. 15 (2013) 063026, arXiv preprint arXiv:1203.5094


Doctoral Dissertation: Quantum and Classical Simulation of Electronic Transport at the Nanoscale
ScholarsArchive@OSU, PDF, 2016.

Honors Undergraduate Thesis: Applied Computing Techniques for Holographic Optical Tweezers
OSU Library Collections, 2010.



Zwolak Research Group


Oregon State University, Fall 2010 - Spring 2016

College of Science, Doctor of Philosophy Physics

Oregon State University, Winter 2007 - Spring 2010

College of Science, H.B.S. Computational Physics, H.B.S. Physics (Applied Option),
College of Engineering, H.B.S. Engineering Physics,
Computer Science Minor, Summa Cum Laude