A selection of Nick's scripts
Most of these scripts work just fine on most UNIX flavors, including Apple's OS X. They were developed and tested mainly under Linux, with not-too-old versions of Perl and Python.
Most scripts provide help and list their available options when invoked with the -h option.
None of them is especially smart. Most of them can help you corrupt/destroy your data (see the note at the bottom). A few friends and students find them useful, though.
Physics-oriented scripts
- conversion:
converts energy units such as eV, J, cm^-1, ...
- drude_dielectric_function.py:
generate the dielectric function of a Drude metal over a wide spectral range (see the first sketch after this list)
- optical_response.py:
takes ω (rad/s), ε1, ε2 (real and imaginary parts of the dielectric function) as input; outputs the real and imaginary parts of the refractive index, the normal-incidence reflectivity, and the absorption coefficient (in m^-1); see the first sketch after this list
- reciprocal_lattice.py:
for a given Bravais lattice, list all reciprocal G vectors of length ≤ maxQ (see the second sketch after this list)
- phys_const.py:
not really a script: load this code inside a Python shell (from phys_const import *) to have several useful symbols such as hbar, c, amu, etc. acquire the values of the relevant SI physical constants
- semiconductor_carrier_concentration.py:
evaluate the chemical potential and the carrier concentration of a doped semiconductor as a function of temperature
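For the curious, here is a first, minimal Python sketch of the physics behind the Drude and optical-response items above. It is not the scripts themselves (their interfaces and options may differ): it just applies the Drude dielectric function ε(ω) = 1 - ωp²/(ω² + iγω) and the textbook relations N = n + ik = √ε, R = |(N-1)/(N+1)|², α = 2ωk/c. The plasma frequency and damping below are arbitrary illustration values.

import numpy as np
from scipy.constants import c                  # speed of light in vacuum (m/s)

def drude_epsilon(omega, omega_p, gamma):
    """Drude dielectric function eps(omega) = 1 - omega_p^2/(omega^2 + 1j*gamma*omega)."""
    return 1.0 - omega_p**2 / (omega**2 + 1j * gamma * omega)

def optical_response(omega, eps1, eps2):
    """From the complex dielectric function, compute n, k, the normal-incidence
    reflectivity R and the absorption coefficient alpha (in m^-1)."""
    N = np.sqrt(eps1 + 1j * eps2)              # complex refractive index N = n + i k
    n, k = N.real, N.imag
    R = np.abs((N - 1.0) / (N + 1.0))**2
    alpha = 2.0 * omega * k / c
    return n, k, R, alpha

# illustration-only parameters, not taken from the scripts
omega = np.linspace(1.0e15, 2.0e16, 6)         # rad/s
eps = drude_epsilon(omega, omega_p=1.3e16, gamma=1.0e14)
for w, n, k, R, a in zip(omega, *optical_response(omega, eps.real, eps.imag)):
    print(f"{w:.3e}  n={n:.3f}  k={k:.3f}  R={R:.3f}  alpha={a:.3e} m^-1")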
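A second sketch, for the reciprocal-lattice item (again only my guess at the internals, not the actual script): from the direct Bravais vectors a1, a2, a3 one builds the reciprocal basis b1 = 2π(a2×a3)/V and cyclic permutations, then lists every integer combination G = h b1 + k b2 + l b3 with |G| ≤ maxQ.

import numpy as np

def reciprocal_vectors(a1, a2, a3, max_q):
    """List all reciprocal lattice vectors G = h*b1 + k*b2 + l*b3 with |G| <= max_q."""
    a1, a2, a3 = (np.asarray(a, float) for a in (a1, a2, a3))
    volume = np.dot(a1, np.cross(a2, a3))
    b1 = 2.0 * np.pi * np.cross(a2, a3) / volume
    b2 = 2.0 * np.pi * np.cross(a3, a1) / volume
    b3 = 2.0 * np.pi * np.cross(a1, a2) / volume
    # since h = G.a1/(2*pi), the indices are bounded by |h| <= max_q*|a1|/(2*pi), etc.
    hmax, kmax, lmax = (int(max_q * np.linalg.norm(a) / (2.0 * np.pi)) for a in (a1, a2, a3))
    hits = []
    for h in range(-hmax, hmax + 1):
        for k in range(-kmax, kmax + 1):
            for l in range(-lmax, lmax + 1):
                G = h * b1 + k * b2 + l * b3
                if np.linalg.norm(G) <= max_q:
                    hits.append(((h, k, l), G))
    return sorted(hits, key=lambda t: np.linalg.norm(t[1]))

# example: simple cubic lattice, a = 4 (length units), maxQ = 3 (inverse length units)
for hkl, G in reciprocal_vectors([4, 0, 0], [0, 4, 0], [0, 0, 4], max_q=3.0):
    print(hkl, np.round(G, 4))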
Data-oriented scripts
- add_kink_points:
add linearly extrapolated points to kinky-looking 2-column data files
- average_variance_of_slope:
evaluate the average slope over slices of an extended 2-column data file (requires fit)
- fit:
very handy to go through zillions of data files with linear fits
- fixmp3names.py:
rename mp3 files so that their filenames match their tags in a regular format
- gauss_conv:
convolve a spectrum with a Gaussian of given width (see the first sketch after this list)
- histogram:
generate the histogram of a 1-column data file
- integral:
get a quick approximation to a 1D integral with minimal effort
- interpolate:
simple linear interpolation of a 2-column data file
- multifit:
extracts a smoothed slope out of a "noisy" x-y data file
- peaks:
extract all local maxima of a function (typically a spectrum or a time series) given as a 2-column file (see the second sketch after this list)
- primitive:
brutal 1D integration to calculate the primitive function
- prunedata:
clean up pointlessly long data files with numerically nearly equal points
- replace string1 string2 file(s):
does what its name promises... to any number of files!!
- running_average:
evaluate an n-point running average of a regularly spaced 2-column data file
- scale:
generate a sequence of equally-spaced numbers
(very similar to the unix utility seq)
- spectrum:
visualizes a line spectrum out of a list of energies (rather primitive, but it contains smart pieces of code, like an autoscale better than that of a lot of professional software)
- squarediff:
computes the square difference of columns 2, 3, ...; useful (together with integral), e.g., to compare 2 functions quantitatively
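To give an idea of what these small data tools do, here is a first Python sketch of a Gaussian convolution like the one gauss_conv performs. It is only a guess at the algorithm, assuming a regularly spaced x grid; the real script's options and conventions may differ.

import numpy as np

def gauss_conv(x, y, sigma):
    """Convolve the sampled function y(x) with a normalized Gaussian of width sigma.
    Assumes a regularly spaced x grid (the simplest case)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    dx = x[1] - x[0]
    # Gaussian kernel sampled with the same spacing, out to +/- 5 sigma
    half = int(np.ceil(5.0 * sigma / dx))
    t = np.arange(-half, half + 1) * dx
    kernel = np.exp(-0.5 * (t / sigma) ** 2)
    kernel /= kernel.sum()                     # discrete normalization
    return np.convolve(y, kernel, mode="same")

# toy spectrum: two sharp lines broadened with sigma = 0.05 (arbitrary units)
x = np.linspace(0.0, 10.0, 2001)
y = np.zeros_like(x)
y[400] = y[1500] = 1.0
for xi, yi in zip(x, gauss_conv(x, y, sigma=0.05)):
    print(xi, yi)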
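A second sketch, for the local-maximum search of peaks (just the core idea; the actual script may add thresholds or smarter filtering):

import numpy as np

def local_maxima(x, y):
    """Return the (x, y) pairs where y is strictly larger than both neighbours."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    inner = np.arange(1, len(y) - 1)
    mask = (y[inner] > y[inner - 1]) & (y[inner] > y[inner + 1])
    idx = inner[mask]
    return list(zip(x[idx], y[idx]))

# toy example: a double-peaked "spectrum"
x = np.linspace(0.0, 10.0, 500)
y = np.exp(-((x - 3.0) ** 2)) + 0.6 * np.exp(-((x - 7.0) ** 2) / 0.5)
for xm, ym in local_maxima(x, y):
    print(f"peak at x = {xm:.3f}, y = {ym:.3f}")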
Tools to analyze/manipulate .xyz files
- cut_cube_in_xyz:
cut an x-y-z box, optionally specifying a time interval
- cut_sphere_in_xyz:
as above, but for a sphere
- dat2xyz:
generate a .xyz file from a 3-column numeric text file
- xyz_1atom:
extract the coordinates of 1 atom only (e.g. in order to plot its trajectory)
- xyz_bond_length:
evaluate and report a specific bond length for all frames
- xyz_color_density:
change atomic nature of atoms in regions where the local density is higher
than a given threshold
- xyz_color_lattice:
change atomic nature of atoms depending on their relation to a 2D (xy)
periodic potential function of given lattice parameter
- xyz_color_property.py:
change atomic nature of atoms depending on some property, either given as an (extra) column of the xyz file or computed as a function of x, y, z
- xyz_delete_atoms:
remove a specific atom from an xyz file
- xyz_delete_close_atoms.py:
in an xyz file, remove atoms which have accidentally bumped into each
other, due to e.g. a merge of 2 xyz files, or to position randomization
- xyz_displacement:
(i) evaluate displacements between successive frames or (ii) change atomic
nature of atoms which move more than a given threshold
- xyz_first_last:
keep only first and last frame
- xyz_gofr.py:
evaluate the 2D or 3D g(r) correlation function averaging over all frames
and taking a periodic box into consideration
- xyz_in_cell:
translate all positions back into one arbitrary primitive cell
(in a loose sense, the converse of xyz_lattice_translate)
- xyz_join.py:
merge (frame by frame) two (or more) xyz files
- xyz_lattice_translate:
replicate the atoms according to translations of a Bravais lattice
- xyz_mark_atom:
change the chemical nature of one specific atom in an xyz file
- xyz_motion_enhance.py:
amplify displacements (by a user-set factor) relative to the linear
interpolation between the first and the last frame
- xyz_rotate.py:
rigidly rotate an xyz file around an axis by an angle
- xyz_select_frame:
choose 1 frame out of a loooong xyz file
- xyz_skip:
keep only one frame every n and discard the rest
- xyz_sphere_density:
evaluate the average number density in a sphere (circle) for particles
given in an xyz file
- xyz_scattering_intensity.py:
evaluate the (neutron-scattering) structure factor S(q) for an array of Dirac deltas at the particle positions provided by an xyz file. To save time, it computes only half of q space; you can then use symmetrize_scattering_intensity.py to restore the missing half. (The bare formula is sketched after this list.)
- xyz_statistics.py:
generate statistical info
- xyz_stretch:
generate an xyz file stretched by a factor (optionally in a given direction)
- xyz_structure_factor.py:
evaluate the X-ray structure factor S(q) for atoms at the particle positions provided by an xyz file. Useful to compute the scattering intensity from a polyatomic cell, to be then multiplied by the Bragg pattern generated by xyz_scattering_intensity.py
- xyz_translate.py:
rigidly translate an xyz file in any direction
- xyz_utils.py:
a Python library for reading/writing xyz files (a generic reader/writer in the same spirit is sketched after this list)
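All the tools above rely on the standard xyz layout: each frame is a line with the number of atoms, a comment line, then one "Symbol x y z" line per atom. The actual xyz_utils.py API is not documented here; the following reader/writer is only a minimal, generic sketch in that spirit (names and behaviour are my own assumption).

def read_xyz(filename):
    """Yield (comment, atoms) for each frame of a standard xyz file,
    where atoms is a list of (symbol, x, y, z) tuples."""
    with open(filename) as f:
        while True:
            header = f.readline()
            if not header.strip():
                return                          # end of file (or stray blank line)
            natoms = int(header)
            comment = f.readline().rstrip("\n")
            atoms = []
            for _ in range(natoms):
                sym, x, y, z = f.readline().split()[:4]
                atoms.append((sym, float(x), float(y), float(z)))
            yield comment, atoms

def write_xyz(filename, frames):
    """Write an iterable of (comment, atoms) frames to an xyz file."""
    with open(filename, "w") as f:
        for comment, atoms in frames:
            f.write(f"{len(atoms)}\n{comment}\n")
            for sym, x, y, z in atoms:
                f.write(f"{sym} {x:.6f} {y:.6f} {z:.6f}\n")

# example: keep only the first and last frame, as xyz_first_last does
# frames = list(read_xyz("traj.xyz"))
# write_xyz("first_last.xyz", [frames[0], frames[-1]])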
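And a guess at the bare formula behind the scattering-intensity tools: for point (Dirac-delta) scatterers in a periodic cubic box of side L, the allowed wavevectors are q = (2π/L)(nx, ny, nz) and the intensity is I(q) = |Σ_j exp(i q·r_j)|²/N. The real scripts additionally handle 2D/3D options, atomic form factors (xyz_structure_factor.py) and the half-q-space trick mentioned above; none of that is reproduced here.

import numpy as np

def scattering_intensity(positions, box_length, nmax):
    """I(q) = |sum_j exp(i q.r_j)|^2 / N on the grid q = (2*pi/L)*(nx, ny, nz),
    with each integer component in [-nmax, nmax]."""
    r = np.asarray(positions, float)            # shape (N, 3)
    npart = len(r)
    out = {}
    grid = range(-nmax, nmax + 1)
    for nx in grid:
        for ny in grid:
            for nz in grid:
                q = 2.0 * np.pi / box_length * np.array([nx, ny, nz], float)
                amplitude = np.exp(1j * (r @ q)).sum()
                out[(nx, ny, nz)] = abs(amplitude) ** 2 / npart
    return out

# toy example: 4 equally spaced particles on a line in a box of side 10
pos = [[0.0, 0, 0], [2.5, 0, 0], [5.0, 0, 0], [7.5, 0, 0]]
intensity = scattering_intensity(pos, box_length=10.0, nmax=4)
for (nx, ny, nz), val in sorted(intensity.items()):
    if ny == 0 and nz == 0 and val > 1e-8:      # show only q along x, for brevity
        print((nx, ny, nz), round(val, 4))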
File/shell-oriented scripts
- backup:
make a backup; to start it automatically every morning at 1:09 (here Tuesday to Saturday), put a line in the crontab (edit it with crontab -e) such as:
9 1 * * 2-6 $HOME/bin/backup >> /tmp/backup.log 2>&1
- crop_pdf:
crop a (1-page!) pdf figure, suppressing useless surrounding white bands
- del:
sort of like rm, but far safer...
- o:
super quick file "open": opens arbitrary file(s) and URL(s) with the correct application, following the user's preferences for their specific media type
- rl:
quick remote login (ssh -X), guess-completing the host name from past hints
- supergrep:
search for simultaneous occurrence of several patterns in many files, and
report the matching filenames
- swap:
often better/safer than mv
- synchronizedisks:
2 PCs, e.g. a desktop and a laptop: make sure the same files are on both. Interactive (you control what you copy where). Based on rsync.
LaTeX-oriented scripts
- accenti:
fix LaTeX accents according to Italian spelling rules
- convert2html:
quick & efficient conversion of LaTeX into html; generates fairly clean and compact html; also useful for LaTeX to .doc/.docx conversion, since MS Word / LibreOffice can import html documents
- convertbiblio (or the more refined convertbiblio.py):
want to use BibTeX to sort your \bibitem bibliography automatically, but never quite dared to put the entries in the horrible BibTeX format?
- latexcontinuo:
happy LaTeXing in 3 windows: editor + okular (or evince, or gv) + latexcontinuo (possibly minimized). Makes kile, lyx and other refined tools unnecessary
- tabletex:
n columns of data converted into a formatted LaTeX table, ready to be pasted into a document (see the sketch below)
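The core of such a conversion fits in a few lines of Python. The following is only a generic sketch, not the actual tabletex (which surely offers more formatting options); the centered-column layout is my own choice.

import sys

def columns_to_latex_table(lines):
    """Turn whitespace-separated columns of numbers/text into a LaTeX tabular."""
    rows = [line.split() for line in lines if line.strip()]
    ncol = max(len(row) for row in rows)
    body = " \\\\\n".join(" & ".join(row) for row in rows) + " \\\\"
    return "\\begin{tabular}{" + "c" * ncol + "}\n" + body + "\n\\end{tabular}"

if __name__ == "__main__":
    # usage: python this_sketch.py < data.dat > table.tex
    print(columns_to_latex_table(sys.stdin.readlines()))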
Students should also check useful instructions on how to produce good hypertextual pdf and how to generate valid PDF/A with LaTeX. A quick look at some Common Errors in Technical Writing also helps.
Pictures/photography-oriented scripts
- arrange_photo:
a nifty tool to organize jpg pictures & avi/mpg movies into folders according to their EXIF data
- fix_eps_fonts:
occasional eps files come with funny fonts which do not display well once included into LaTeX; this script manages to clean them up and make them fully LaTeX-compatible
- fix_jpg_tags:
use exiftool to retrieve and correct the exif data from another picture with the same initial part of the name (e.g. the raw file which originated a jpg picture)
- images2pdf:
use sam2p to convert (several) bitmapped images into individual pdf files, plus one global pdf with one picture per page
- map_photo.py:
combine foxtrotgps and exiftool to replace or add geolocation information to a picture with misplaced or missing GPS exif data
- squeeze_eps:
reduce the size of those huge eps files, using sam2p
Instructions for unix newbies
To use these scripts...
- fire up a command-line terminal;
- create the appropriate folder [mkdir $HOME/bin] (in case it already exists, no worry, you just get a harmless error message);
- save the script "scriptname" in your bin, or wherever your browser wants. NB! Never use cut&paste from the browser screen to download any script: use the browser's "Save Page As..." menu command!
- move it into your bin, e.g. [mv ~/Downloads/scriptname ~/bin];
- let your computer know it is executable [chmod +x ~/bin/scriptname].
After you log out and log back in, "scriptname" should become a legitimate command, exactly like ls or cp, and it should run regardless of the folder you are working in.
If it does not work, most likely your path needs fixing, typically by adding a line of instructions to your ~/.bashrc file.
The simplest way to add this instruction is to execute the command
echo -e '\nPATH=$HOME/bin:$PATH' >> $HOME/.bashrc
(Please execute this command once only: repeated executions of this line produce a dumb sequence of redundant lines at the end of "$HOME/.bashrc". In case you accidentally created them, please edit that file and delete all but one line PATH=$HOME/bin:$PATH.)
After your next login your "scriptname" should execute fine from the command line.
Whenever it makes sense, these scripts send their output to stdout, i.e. the output appears in the terminal window. Their normal use is therefore
scriptname [options] inputfile [moreinputfiles] > outputfile
For example:
prunedata file_to_be_pruned.dat > pruned_file.dat
If you forgot to redirect output with > and your terminal happens to be scrolling zillions of lines, you'd better press CTRL-C to kill the execution and repeat your command line with an appropriate redirection.
A note on redirecting output: command > somefile replaces whatever content was in somefile with the output of command. This may lead to the silent, irreversible destruction of potentially precious data.
If this makes you feel unsafe (it should!), then
- back up regularly,
- test scripts in folders where you keep no precious unduplicated data, and
- think twice before pressing enter!
Comments / debugging / patches are welcome!