A selection of Nick's scripts
Most of these scripts work just fine on most UNIX flavors, including
Apple's OS X. They were
developed and tested mainly under
not-too-old versions of perl.
Most scripts provide help and list their available options when invoked
with the -h option.
None of them is especially smart (and, by sufficiently bad abuse, most of
them can corrupt/destroy your data...), but a few friends and students find
them useful.
add linearly extrapolated points to kinky-looking 2-column data files
evaluate the average slope over slices of an extended 2-column data file
make a backup: to start it automatically every morning at 1:09,
put a line in crontab (edit it with crontab -e) such as:
9 1 * * 2-6 $HOME/bin/backup >> /tmp/backup.log 2>&1
convert energy units such as eV, J, cm-1, ...
rm, but far safer...
very handy to go through zillions of data files with linear fits
organize the filenames of mp3 files to fit their tags in a regular format
convolve a spectrum with a Gaussian of given width
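The Gaussian-convolution step can be sketched in a few lines (a minimal sketch, assuming an equally spaced x grid; the function name `gaussian_convolve` is made up):

```python
import numpy as np

def gaussian_convolve(x, y, sigma):
    """Convolve a sampled spectrum y(x) with a unit-area Gaussian of width sigma.
    Assumes the x grid is equally spaced."""
    dx = x[1] - x[0]
    n = int(4 * sigma / dx)                     # kernel sampled out to +-4 sigma
    t = np.arange(-n, n + 1) * dx
    kernel = np.exp(-0.5 * (t / sigma) ** 2)
    kernel /= kernel.sum()                      # normalize: conserve total area
    return np.convolve(y, kernel, mode="same")

x = np.linspace(-5.0, 5.0, 1001)
y = np.zeros_like(x)
y[500] = 1.0                                   # a single sharp line at x = 0
broad = gaussian_convolve(x, y, sigma=0.5)     # broadened into a Gaussian peak
```

Normalizing the kernel to unit sum preserves the integrated intensity of the spectrum, which is usually what one wants when broadening lines.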
generate the histogram of a 1-column data file
minimal effort to get some quick approximation to a 1D integration
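For a quick 1D integration of a 2-column file, the trapezoid rule is the natural minimal-effort choice (a sketch; `integrate_xy` and the demo file name are hypothetical):

```python
import numpy as np

def integrate_xy(filename):
    """Trapezoid-rule integral of a 2-column (x, y) data file."""
    x, y = np.loadtxt(filename, unpack=True)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# quick check on y = x over [0, 1]: the exact integral is 0.5
xs = np.linspace(0.0, 1.0, 11)
np.savetxt("demo_xy.dat", np.column_stack([xs, xs]))
value = integrate_xy("demo_xy.dat")
```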
simple linear interpolation of 2-column data file
taking ω (rad/s), ε1, ε2 (real
& imaginary parts of the dielectric function) as input, outputs the real &
imaginary parts of the refractive index, the normal reflectivity, and the
absorption coefficient (in m^-1)
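The optics behind this is compact enough to sketch with the standard formulas ñ = n + ik = √(ε1 + iε2), R = ((n−1)² + k²)/((n+1)² + k²), α = 2ωk/c (the function name is made up):

```python
import numpy as np

C = 299792458.0   # speed of light, m/s

def optical_constants(omega, eps1, eps2):
    """From omega (rad/s) and the dielectric function eps1 + i*eps2, return
    n, k (complex refractive index), normal-incidence reflectivity R and
    absorption coefficient alpha (m^-1)."""
    nc = np.sqrt(eps1 + 1j * eps2)   # principal branch: n, k >= 0 for eps2 >= 0
    n, k = nc.real, nc.imag
    R = ((n - 1.0) ** 2 + k ** 2) / ((n + 1.0) ** 2 + k ** 2)
    alpha = 2.0 * omega * k / C
    return n, k, R, alpha

# sanity check: a transparent medium with eps = 4 has n = 2, k = 0,
# R = (1/3)^2 and no absorption
n, k, R, alpha = optical_constants(1.0e15, 4.0, 0.0)
```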
extracts a smoothed slope out of a "noisy" x-y data file
- o, short for "open": opens arbitrary file(s) and
URL(s) according to the user's preferences for the mime types
extract all local maxima of a function (typically a spectrum, or a time
series) given as a 2-column file
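The local-maximum search boils down to comparing each interior point with its two neighbors (a minimal sketch for strict maxima; flat plateaus would need extra care):

```python
import numpy as np

def local_maxima(x, y):
    """Return the (x, y) pairs of all strict interior local maxima of a
    function sampled as two arrays."""
    x, y = np.asarray(x), np.asarray(y)
    # a point is a strict maximum if it exceeds both neighbors
    idx = np.where((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:]))[0] + 1
    return x[idx], y[idx]

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0.0, 1.0, 0.0, 2.0, 0.0, 3.0, 0.0])
px, py = local_maxima(x, y)    # peaks at x = 1, 3, 5
```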
clean up pointlessly long data files with numerically nearly equal points
fast remote login (ssh -X), guess-completing the host name
from past hints
compute all reciprocal G vectors of length <= maxQ for a given Bravais lattice
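The G-vector enumeration can be sketched as follows (a sketch, with the direct lattice vectors as the rows of `a`; bounding the integer scan by the smallest singular value of the reciprocal basis guarantees no vector inside the cutoff is missed):

```python
import numpy as np
from itertools import product

def g_vectors(a, max_q):
    """All reciprocal-lattice vectors G = m1*b1 + m2*b2 + m3*b3 with
    |G| <= max_q, for a 3x3 matrix a whose rows are the direct lattice vectors."""
    b = 2.0 * np.pi * np.linalg.inv(a).T       # rows of b are b1, b2, b3
    # |m.b| >= sigma_min * |m|, so scanning |m_i| <= max_q / sigma_min is safe
    sigma_min = np.linalg.svd(b, compute_uv=False)[-1]
    nmax = int(np.ceil(max_q / sigma_min))
    gs = []
    for m in product(range(-nmax, nmax + 1), repeat=3):
        g = np.array(m) @ b
        if np.linalg.norm(g) <= max_q + 1e-12:
            gs.append(g)
    return np.array(gs)

# simple cubic lattice with a = 2*pi: the reciprocal lattice is simple cubic
# with spacing 1, so |G| <= 1 keeps G = 0 plus the 6 shortest vectors
gs = g_vectors(2.0 * np.pi * np.eye(3), 1.0)
```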
replace string1 string2 file(s):
does what its name promises... in any number of files!
evaluate an n-point running average of a 2-column regularly-spaced data file
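The running average itself is a one-line convolution (a sketch; `running_average` is a made-up name):

```python
import numpy as np

def running_average(y, n):
    """n-point running average of a regularly spaced series; only the 'valid'
    part is returned, i.e. len(y) - n + 1 points."""
    return np.convolve(y, np.ones(n) / n, mode="valid")

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
avg = running_average(y, 3)    # -> [2.0, 3.0, 4.0]
```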
generate a sequence of equally-spaced numbers
(very similar to the unix utility seq)
visualize a line spectrum
out of a list of energies (rather primitive, but contains smart pieces of
code, like the autoscale, better than a lot of professional software)
compute the square difference of columns 2,3,...
Useful (together with integral) e.g. to compare 2 functions quantitatively
search for simultaneous occurrence of several patterns in many files, and
report the matching filenames
- swap: synchronize
2 PC's, e.g. a desktop and a laptop: make sure the same files are on both.
Interactive (you control what you copy where).
Based on rsync
cut an x-y-z box, optionally specifying a time interval
as above, but for a sphere
generate a .xyz file from a 3-column numeric text file
extract the coordinates of 1 atom only (e.g. in order to plot its trajectory)
evaluate and report a specific bond length for all frames
change atomic nature of atoms in regions where the local density is higher
than a given threshold
change atomic nature of atoms depending on their relation to a 2D (xy)
periodic potential function of given lattice parameter
change atomic nature of atoms depending on some property, either given
as an (extra) column of the xyz file or computed as a function of x, y, z
remove a specific atom from an xyz file
in an xyz file, remove atoms which have accidentally bumped into each
other, due to e.g. a merge of 2 xyz files, or to position randomization
(i) evaluate displacements between successive frames or (ii) change atomic
nature of atoms which move more than a given threshold
keep only first and last frame
evaluate the 2D or 3D g(r) correlation function averaging over all frames
and taking a periodic box into consideration
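A single-frame version of the g(r) computation can be sketched like this (a sketch assuming an orthorhombic periodic box; averaging over frames would simply accumulate the histograms before normalizing):

```python
import numpy as np

def g_of_r(positions, box, nbins=50, rmax=None):
    """Radial distribution function g(r) of one frame, for particles in a
    periodic orthorhombic box = [Lx, Ly, Lz]; normalized so that an ideal
    gas gives g(r) = 1."""
    positions = np.asarray(positions, dtype=float)
    box = np.asarray(box, dtype=float)
    if rmax is None:
        rmax = box.min() / 2.0                  # minimum image is valid up to L/2
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                # minimum-image convention
    r = np.sqrt((d ** 2).sum(axis=-1))
    r = r[np.triu_indices(len(positions), k=1)] # each pair counted once
    hist, edges = np.histogram(r, bins=nbins, range=(0.0, rmax))
    n = len(positions)
    rho = n / box.prod()
    shell = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = rho * shell * n / 2.0               # expected pair counts, ideal gas
    return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

# an ideal-gas frame should give g(r) close to 1 at large r
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, size=(400, 3))
r_mid, g = g_of_r(pos, [10.0, 10.0, 10.0])
```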
report all positions back into an arbitrary primitive cell
(the converse of xyz_lattice_translate)
replicate the atoms in an xyz file according to a Bravais lattice
(the converse of xyz_in_cell)
change the chemical nature of a specific atom in an xyz file
amplify displacements (by a user-set factor) relative to the linear
interpolation between the first and the last frame
rigidly rotate an xyz file around an arbitrary axis by a given angle
scale the positions by a factor
choose 1 frame out of a loooong xyz file
keep only one frame every n and discard the rest
evaluate the average number density in a sphere (circle) for particles
given in an xyz file
evaluate the structure factor S(q) for an array of Dirac deltas at the
particle positions provided by an xyz file. To save time, compute only half
of q space; a separate script can then
restore the missing half of q space.
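The core of the S(q) computation, and the symmetry that makes half of q space sufficient, can be sketched like this (a sketch; since the positions are real, S(−q) = S(q), so the other half of q space follows for free):

```python
import numpy as np

def structure_factor(positions, qs):
    """S(q) = |sum_j exp(-i q.r_j)|^2 / N for point particles (Dirac deltas)
    at the given positions (N x 3), for each q vector in qs (M x 3)."""
    positions = np.asarray(positions, dtype=float)
    qs = np.asarray(qs, dtype=float)
    phases = np.exp(-1j * qs @ positions.T)     # M x N table of exp(-i q.r_j)
    return np.abs(phases.sum(axis=1)) ** 2 / positions.shape[0]

# two particles one unit apart along x: S(q) = 2 cos^2(q_x / 2)
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
qs = np.array([[0.0, 0.0, 0.0], [np.pi, 0.0, 0.0], [2.0 * np.pi, 0.0, 0.0]])
s = structure_factor(pos, qs)                   # -> [2, 0, 2]
```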
generate statistical info
generate an xyz file stretched by a factor (optionally in a given direction)
translate rigidly an xyz file in any direction
Students should also check useful instructions on how
to produce good hypertextual pdf and how
to generate good pdf/a
A quick look at some
Errors in Technical Writing also helps.
fix LaTeX accents according to Italian spelling rules
quick & efficient for converting LaTeX into html;
generate fairly clean and compact html;
useful also for LaTeX to .doc/.docx conversion,
since MS-Word and similar word processors
can import html documents
- convertbiblio, or the more
refined convertbiblio.py: wanna
use BibTeX to sort a
\bibitem bibliography automatically, but never quite
dared to put your references in the horrible bibtex format?
- latexcontinuo: Happy LaTeXing in 3
windows: editor + viewer
+ latexcontinuo (possibly minimized);
a lightweight alternative to lyx and other refined tools
n columns of data become a formatted LaTeX table, ready to be pasted
into a document
To use these scripts...
a nifty tool to organize jpg pictures & avi/mpg movies into folders
according to their EXIF data
- fix_eps_fonts: occasional eps files
come with funny fonts which do not display well once included in latex.
This script manages to clean them up and make them fully latex-compatible
- fix_jpg_tags: use exiftool to
retrieve and correct the exif data from another picture with the same
initial part of the name (e.g. the raw file from which a jpg originated)
- images2pdf: use
sam2p to convert (several)
bitmapped images into individual pdf files plus one global pdf with one
picture per page
- map_photo.py:
replace or add geolocation information to a picture with misplaced or
lacking GPS exif data
- squeeze_eps: reduce the size of those
huge eps files, using sam2p
- fire up a command-line terminal;
- create the appropriate folder [mkdir $HOME/bin] (in case it already
exists, no worry: you get an error message, but no harm is done);
- save the script "scriptname" in there, or save it wherever
your browser wants and then move it in there, e.g.
[mv ~/Downloads/scriptname ~/bin].
NB! never use cut&paste
from the browser screen to download any script: use the browser "Save As"
function under "File"!
- let your computer know it is executable
[chmod +x ~/bin/scriptname].
After you log out and log back in, "scriptname" should become
a legitimate command:
it should run regardless of the folder you are working in.
If it does not work, most likely your
path needs fixing,
typically by adding a line of instructions to your $HOME/.bashrc.
The simplest way to add this instruction is executing the command
printf '\nPATH=$HOME/bin:$PATH\n' >> $HOME/.bashrc .
(printf, rather than echo, makes sure the \n becomes an actual newline.)
(Please execute this command once only: repeated executions of this line
produce a dumb sequence of redundant lines at the end of $HOME/.bashrc.
In case you accidentally created them, please edit that file and delete all
but one line.)
After your next login, "scriptname" should execute fine
from the command line.
Whenever it makes sense, these scripts throw their output to stdout,
i.e. output appears in the terminal window.
Their normal use is therefore
scriptname [options] inputfile [moreinputfiles] > outputfile
e.g.
prunedata file_to_be_pruned.dat > pruned_file.dat
In case you forgot to redirect output with > and your
terminal happens to be scrolling zillions of lines, then maybe you'd better
press CTRL-C to kill execution, and repeat your command line with an
output redirection added.
Most scripts honor the -h option, which lists the accepted command-line options.
A note on redirecting output:
command > somefile replaces
whatever content was in
somefile with the output of
command. This may lead to the silent, irreversible
destruction of potentially precious data.
If this makes you feel unsafe (it should!), then
- test scripts in folders where you keep no precious unduplicated data,
- think twice before pressing Enter.
Comments / debugging / patches are welcome!