I am Mirna van Hoek, a physicist with an interest in programming and solving problems.
My CV (Dutch version).
- Work at present.
- Work after return to the Netherlands.
- SLAC and BaBar pictures.
- Work at BaBar.
- Work at L3.
- Defence.
Work at present
I work at KNMI as a physicist involved in the Ozone Monitoring Instrument (OMI) project. OMI is one of the four instruments on the AURA satellite. It measures, among other things, ozone, NO2, SO2, aerosols and clouds. As a member of the Instrument Operations Team (IOT) I am responsible for the health and safety of OMI, and I am on call one week out of every two. I ensure measurements are performed every day, and I developed the automatic monitoring program (IDL and Python) that checks the instrument engineering parameters for unexpected trends. During the ozone hole season there is less ozone over the South Pole, so less light is absorbed and more light hits our instrument, which could cause problems due to overexposure of the CCD detectors. IDL code monitors the ozone hole during that season so we can react in time to prevent overexposure. Other projects I have worked on are DEScTOP, developing simulation code for new satellite missions such as TropOMI, and determining a correction for the variation of tropopause height with latitude, which greatly reduces errors in the ozone profile algorithm.
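The core of such trend monitoring can be sketched in a few lines of Python. This is a minimal illustration, not the actual KNMI code: the window size, threshold and readings below are all invented.

```python
import statistics

def flag_trend(values, window=24, n_sigma=3.0):
    """Flag indices where a sample deviates more than n_sigma standard
    deviations from the mean of the preceding window of samples."""
    flags = []
    for i in range(window, len(values)):
        ref = values[i - window:i]
        mean = statistics.fmean(ref)
        sigma = statistics.stdev(ref)
        if sigma > 0 and abs(values[i] - mean) > n_sigma * sigma:
            flags.append(i)
    return flags

# A stable housekeeping reading with one sudden excursion:
readings = [20.0 + 0.05 * (i % 3) for i in range(48)]
readings[40] = 25.0
print(flag_trend(readings))  # the excursion at index 40 is flagged
```

In practice the monitoring would run over many parameters and report the flagged ones; the sliding-window comparison shown here is just the simplest way to catch an unexpected jump.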
Work after I returned to the Netherlands
After I returned to the Netherlands I worked for one more year for the University of Colorado. After that contract ended I was without a job and did some unpaid work.
Development OMI website
I worked for two months (and a bit more) at KNMI (Royal Netherlands Meteorological Institute). The OMI (Ozone Monitoring Instrument) project did not have much of a website yet. Rene is the OMI webmaster, but he has hardly any time to work on the site, so it was up to me to develop the style. I started from the new KNMI web style, but instead of using tables for layout I used only CSS. One requirement I had to deal with was that the site also had to be readable in the old browser Netscape 4.7 (some people with old workstations could not run a more up-to-date browser). This requirement gave me some headaches, since CSS was not fully implemented in NS 4.7. The answer was to define a separate stylesheet and let JavaScript figure out which stylesheet to serve. Since OMI is a project in its own right, the style of the OMI pages had to differ enough from the new KNMI style. That was not much of a problem, since the KNMI style did not provide enough possibilities for us to implement all the features we wanted. Now my work is done and it is up to Rene to fill the pages with content. The pages are open to the public: www.knmi.nl/omi/
Development of online DB for NIMF
In 2004 I joined NIMF (network for women in physics, mathematics and informatics). At the ALV (general member meeting) of February 2004, the first meeting I attended, someone suggested putting the network guide (with information about each member) online. Some thought it would be too difficult or that security would be a problem. I thought: "How difficult can it be?" But since it was my first meeting and I did not know anybody there, I did not feel like volunteering immediately; I did that a month or two later. I did not know anything about MySQL (a relational database, like Oracle except free) or PHP (a scripting language), but I had the O'Reilly book on MySQL, which contained an example of a wedding gift registry database coded in PHP. First I had to install PHP, Apache and MySQL on my laptop. I started by designing the database and filled it with some bogus rows. Then I started designing the online guide. At first, typing a name would display that person's information on your screen. Slowly I extended it to allow updates (only by the person herself or the database manager) and searches (by name, employer, city and age). With each step I learned a lot. At the end of June I presented my work to others (who seemed rather impressed). After the summer holiday the board decided to go ahead with the project. I received a copy of the old network guide information and had to press it into the new format. Yes, press it: some entries in the old format were rather loose. For instance, "career" was a single entry in the old format, and basically everybody had filled it in her own way. In the new format there is an entry for every job, with period, employer and description. I wrote a Perl script to do the work (though at some point I encountered so many exceptions that I began to wonder whether it would not have been quicker to type all the entries by hand). Then I was able to fill the database on my laptop with real data.
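To illustrate the kind of schema and search involved, here is a sketch in Python with SQLite rather than the actual PHP/MySQL code; the table layout, member data and column names are all invented for the example.

```python
import sqlite3

# Illustrative schema (not the real NIMF one): one row per member,
# one row per job, so "career" is structured instead of free text.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE member (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE job (
    member_id INTEGER REFERENCES member(id),
    period TEXT, employer TEXT, description TEXT);
""")
con.execute("INSERT INTO member VALUES (1, 'A. Example', 'Utrecht')")
con.execute("INSERT INTO job VALUES (1, '1999-2002', 'KNMI', 'physicist')")

# A search like the online guide's: by city or employer.
rows = con.execute("""
    SELECT m.name, j.employer
    FROM member m JOIN job j ON j.member_id = m.id
    WHERE m.city = ? OR j.employer = ?""", ("Utrecht", "KNMI")).fetchall()
print(rows)  # [('A. Example', 'KNMI')]
```

The point of the per-job table is exactly the "pressing into format" described above: once each job is its own row, searching by employer or period becomes a simple query instead of parsing free text.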
In the meantime NIMF acquired a domain name, stichtingnimf.nl, hosted by prepaidwebhost. I uploaded the scripts to the webhost and created a database there. In January 2005 we were ready for testing. About 12 people tested the database and gave their suggestions: a photo upload option, the possibility to change the address information of an employer... A big issue was security; after all, we are dealing with people's personal information here. In the end we settled on a shared SSL certificate from prepaidwebhost. You do get a warning each time you connect to the online database (something about the certificate being issued for prepaidwebhost and not stichtingnimf; in other words, there might be something fishy going on, though it is not likely), but it serves our purposes well enough (after all, we are not a bank or something). After providing a means to print a user-friendly page (or pages) using the FPDF class to generate a PDF document, and updating the information in the database to April 2005, the online database is ready to go live. The date has been set for 1 July 2005. I have already written the script to create accounts for all 115 (latest count) members, so I am all set to go. In the meantime I have also become a member of the PR committee of NIMF. There will be a symposium in November, so we will have our work cut out for us.
Work at BaBar
Until December 2002 I worked for the University of
Colorado, Boulder, and I was stationed at
SLAC until the end of April 2002. From May
until December I worked from the Netherlands.
I worked in the BaBar
collaboration (see the
BaBar public pages if you are not a BaBarian).
BaBar is a detector
that was built to study the millions of B mesons produced by the
PEP-II storage ring.
The primary goal is to study and measure CP violation in the neutral B system.
High energy physics requires large and expensive particle colliders and
detectors. A single university or institute cannot afford these costs. That
is why high energy physics is always an international effort.
There is a collaboration of universities and institutions from all over
the world involved in
BaBar.
Some people are stationed at
SLAC (in California),
but others are stationed at their home university or lab elsewhere in the US,
Europe, or wherever. Meetings are always held in the morning (Pacific time) at 8 AM
(which makes it 5 PM in Europe). Presentations are always posted on the web
(as HTML, PostScript, PDF or even PPT (though that is not always appreciated by
everybody)), so everybody anywhere can look at them. It also requires some
discipline from the people actually present in the meeting room at
SLAC to
speak clearly and into the microphone, and not to place the mike in front of
the fan of the overhead projector or some such thing.
Besides the meetings, we stay up to date on everything via the
hypernews (BaBarians only)
with its many different forums for simulation, physics, detector, database,
reviews, etc.
Unlike some other experiments, where it can be very difficult to stay
up to date when you are at your home institution, in BaBar you can keep
well informed of what is going on
even when staying at your home institution.
To find out more about high energy physics, detectors etc., visit the
SLAC Virtual Visitor Center.
Simulation
I was responsible for the detector response simulation of the drift chamber for two years.
During that time I developed a better method of
modelling (BaBarians only) the inefficiency of the chamber.
The first method depended on the path length that a particle travels through
a cell (and thus the amount of charge deposited on the wire). Unfortunately,
this method depended on the HV setting at which the chamber was operated,
and that changed three times in two years! In a second iteration of the code
I developed a
method (BaBarians only) that is independent of the HV setting of the chamber.
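The difference between the two approaches can be sketched as follows. This is purely illustrative Python, not the BaBar code; the functional form, threshold and numbers are invented to show why one parameterization needs retuning after an HV change and the other does not.

```python
import math

def eff_charge_based(path_length_cm, hv_gain):
    """First method (sketch): hit probability derived from deposited
    charge, which scales with path length times an HV-dependent gas
    gain -- so the model must be retuned whenever the HV changes."""
    charge = path_length_cm * hv_gain          # arbitrary units
    return 1.0 - math.exp(-charge / 5.0)       # illustrative threshold curve

def eff_tabulated(cell_efficiency):
    """Second method (sketch): use a measured per-cell efficiency
    directly, with no HV setting assumed anywhere in the code."""
    return cell_efficiency

print(eff_charge_based(1.0, 4.0))  # changes when the HV (gain) changes
print(eff_charge_based(1.0, 6.0))
print(eff_tabulated(0.97))         # stays valid across HV changes
```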
Before detector response issues can be taken into account we first need to generate
particles and track them through the different detector volumes. We made use of the
Geant3 toolkit (written in Fortran) for this.
This package was redesigned and rewritten using C++ and object-oriented technology and released by CERN as the
Geant4 toolkit.
In order to use it, a new application layer needed to be built on the Geant4
toolkit, which we called
Bogus
(BaBar Object-oriented Geant4-based Unified Simulation).
It defines the detector sensitive volumes and step sizes, calculates positions and
idealized energy deposits in the detector (referred to as "GHits"), and stores
this information in the Objectivity database.
A lot of code writing and testing needed to be done before we could put it
into production. For instance, the simulation code based on Geant3 used to make
hits in each cell of the drift chamber (which takes a lot of time, because at each
step it needs to check where it is and whether to produce a hit in the cell or not).
When we switched to Geant4 and Bogus it was decided to produce layer hits (the chamber has
40 layers, so that is only 40 steps per particle). I provided a faster algorithm
to produce the actual cell hits that are needed in the detector response
step of the simulation.
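The idea of deriving cell hits from a layer crossing can be shown with a toy geometry. This is not the actual Bogus algorithm: real drift-chamber cells involve stereo angles and radial structure, and the cell count here is invented.

```python
import math

def cells_crossed(phi_in, phi_out, n_cells):
    """Cells (indexed 0..n_cells-1 around a layer) traversed between
    entry azimuth phi_in and exit azimuth phi_out (radians, short way
    around, not crossing phi = 0). Toy geometry for illustration only."""
    width = 2 * math.pi / n_cells
    a, b = sorted((phi_in % (2 * math.pi), phi_out % (2 * math.pi)))
    first, last = int(a // width), int(b // width)
    return list(range(first, last + 1))

# A track entering a 100-cell layer at phi = 0.05 and leaving at 0.20:
print(cells_crossed(0.05, 0.20, 100))
```

The saving is that the tracking step only records one entry/exit pair per layer (40 steps per particle), and cheap geometry like this recovers the individual cells afterwards.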
Furthermore, the Geant4/Bogus code needed to be tested. Comparing Geant3 simulated
events with Geant4/Bogus simulated events (and real data) helped to spot differences
that needed to be investigated further. Sometimes differences were expected due to
improvements in the Geant4/Bogus code, but sometimes they indicated a bug in the
Geant4/Bogus code.
Another big project I was involved in was providing a more realistic
trigger simulation (the drift chamber provides half of the trigger information).
This meant that for inefficiency modelling the trigger needed to be checked
against the data as well, not only the drift chamber.
In simulation only physics events are generated. But in real life you also
have events and tracks produced by interactions with, for instance, the beam pipe:
so-called machine background.
We take special data runs in which there are no collisions of the electron and
positron beams, so you only have the machine background.
This background data is mixed in with the simulated events to give a more
realistic picture. Of course, the timing (between real background data and
simulated physics) is an important issue here. During our work to
provide a more realistic trigger simulation we also needed to revisit this timing
issue, and had to involve the other subsystems too so we would do it consistently
throughout the detector.
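The mixing step can be sketched as follows. This is a toy illustration with invented hit tuples and channel names, not the BaBar mixing code; the key point is the time offset that puts both samples on the same event clock.

```python
def mix_background(sim_hits, bg_hits, t_offset_ns):
    """Merge simulated physics hits with recorded machine-background
    hits, after shifting the background times so both lists share the
    same event clock. Hits are (time_ns, channel) pairs; sketch only."""
    shifted = [(t + t_offset_ns, ch) for t, ch in bg_hits]
    return sorted(sim_hits + shifted)

sim = [(10.0, "dch:layer5"), (12.5, "emc:crystal801")]
bg = [(-3.0, "dch:layer1")]
print(mix_background(sim, bg, 14.0))
```

Getting t_offset_ns wrong for one subsystem would put its background hits in the wrong time window relative to the others, which is why the timing had to be revisited consistently across the whole detector.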
In BaBar we work with two types of releases: even numbers for production running
and odd numbers for development. When a development release becomes
the production release is determined by data taking, which is in turn driven
by the conference schedule. So basically all the work we do is under the
deadline of conferences.
Analysis
In Colorado we analyse rare B decay modes. We usually refer to
them as charmless modes, because these B decays do not involve a charm quark
(B decays involving charm quarks are quite common). Because of the rarity of
these decays, we need a lot of statistics to find these rare modes.
Luckily, BaBar is a B factory.
To avoid experimentalist bias, the analysis is done blind, i.e., we do not
look at the number of signal events until we are
satisfied with the method with which it was derived.
We are required to regularly present a progress report to the analysis group.
The analysis also has to be written up in an analysis document.
Each reasonably far advanced analysis is assigned a review committee.
This committee consists of 3-4 people from inside or outside the analysis
group who have to read the analysis document and judge whether the analysis
is sound and, eventually, whether the result can be unblinded (whether we can
look at the answer).
I have been involved in measuring the branching ratio of:
B -> eta K*.
The analysis involves analysing 8 modes (the eta decays two ways and the K* decays
four ways), not to mention the 4 B -> eta rho modes (the rho decays two ways) that
are a special background for this analysis. It would be a pain to have
a specialized program for each mode that is the same except in a couple
of places in the code, let alone to keep each of them up to date.
Therefore, I developed a general
user package to handle
several different quasi-twobody modes more easily. Out of 100 million events,
the 1000 or so possibly interesting events are selected, and more detailed
information (masses and momenta of particles, and event quantities that can be
used to separate signal from background) is calculated and written out
in ntuple (PAW) or rootuple (ROOT) format. The code can easily deal with
modes other than eta K* by changing just a few lines in a configuration (tcl) file, so
new modes can be analysed more quickly and students can more easily be involved.
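The configuration-driven idea can be sketched in Python, with a dictionary standing in for the tcl file. The submode lists below are illustrative, not the exact reconstruction channels used in the analysis.

```python
from itertools import product

# Hypothetical configuration, standing in for the tcl file: each
# resonance lists its reconstructed decay modes.
config = {
    "eta": ["gamma gamma", "pi+ pi- pi0"],
    "K*":  ["K+ pi-", "K0s pi0", "K+ pi0", "K0s pi+"],
}

# One generic analysis path per combination, instead of eight
# hand-maintained copies of nearly identical code:
modes = [f"B -> eta({e}) K*({k})"
         for e, k in product(config["eta"], config["K*"])]
print(len(modes))  # 2 eta modes x 4 K* modes = 8 quasi-twobody modes
```

Swapping in a different resonance pair then only means editing the configuration, exactly the "few lines of change" described above.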
The ntuples/rootuples are then combined in a separate analysis
job, for either a simple cut-and-count analysis (make a stringent selection to remove
as much background as possible; count how many events you end up with in the
signal region; estimate how much background you expect in the signal region;
calculate your signal; estimate the significance of the result) or a maximum
likelihood fit. The likelihood fit is multidimensional. Observables
are usually the B mass and energy, a (Fisher) discriminant (to discriminate between
signal and background), resonance masses and, where useful, orientation. The
simplest fits deal with only two components, signal and combinatoric background,
but contributions from other types of background can also be taken into
account. For instance, B -> eta rho can mimic B -> eta K* if one of the pions from the rho
is mistaken for a kaon and thus ends up forming a K* with the other pion instead
of a rho. Either a stringent selection is performed to remove most or all of this
background (while keeping the signal events), or the extra component is explicitly
taken into account in the likelihood fit.
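The cut-and-count recipe in parentheses above can be written out in a few lines. The simple Gaussian estimate s/sqrt(b) is a common rough approximation for the significance; the counts here are invented for illustration.

```python
import math

def cut_and_count(n_observed, b_expected):
    """Cut-and-count sketch: the signal yield is the observed count in
    the signal region minus the expected background there, with a rough
    Gaussian significance estimate s/sqrt(b)."""
    s = n_observed - b_expected
    significance = s / math.sqrt(b_expected)
    return s, significance

s, z = cut_and_count(n_observed=58, b_expected=25.0)
print(s, round(z, 2))  # 33 excess events, about 6.6 sigma
```

A real analysis would use a proper Poisson treatment and include the uncertainty on the background estimate, but the bookkeeping is this simple at heart.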
For some of these rare decays we can also measure CP violation (if we have
gathered enough statistics). This complicates the analysis a lot.
We need to add a time-dependent term to the maximum likelihood fit, tag each
event as a B0 or a B0bar (i.e., tag whether the B meson contains a b quark or an
anti-b quark), and estimate how often we make a wrong tag.
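The effect of wrong tags can be made concrete: a mistag fraction w dilutes the measured asymmetry by the standard factor D = 1 - 2w (at w = 0.5 the tag carries no information at all). A minimal sketch with invented numbers:

```python
def diluted_asymmetry(true_asymmetry, mistag_rate):
    """A wrong tag flips the apparent flavour, so the measured CP
    asymmetry is reduced by the dilution factor D = 1 - 2w, where w
    is the probability of tagging the B meson wrongly."""
    return (1.0 - 2.0 * mistag_rate) * true_asymmetry

print(diluted_asymmetry(0.70, 0.25))  # only half the true asymmetry survives
```

This is why estimating the mistag rate matters: the fit must scale the observed asymmetry back up by 1/D to recover the true one.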
BaBar already has the
RooFit package, which takes care of the hard-core fitting issues.
Building up the fit for a particular mode using
RooFit building blocks is easy and very
flexible, but setting up a fit for another mode might require different building blocks
(for instance, not all resonance mass distributions have the same shape) and involves
a lot of awkward copying, pasting and adapting.
Therefore, I developed a
fit package
on top of the existing RooFit
package that takes care of most of the work of setting up the fit with different building
blocks; it has many switches to provide a lot of freedom, and can be configured for
a different mode with a few lines of change in the configuration file.
Both the general user package and the fit package are now also used outside the Colorado group.
Work in L3
Before this I worked at the University of Nijmegen
(KUN) as PhD student in the
HEFIN group, which stands for
High Energy Physics in Nijmegen.
I have worked in the L3 collaboration.
L3 is one of the four
LEP experiments at CERN.
CERN is the European centre for
high energy and particle physics, near Geneva
in Switzerland.
I was responsible for monitoring of the L3 Silicon Microvertex Detector
during its first year of operation.
I have worked in the Heavy Flavour Analysis Group. My thesis subject was:
Excited Beauty at L3.
This analysis was concerned with the detection of excited B mesons (particles
with a b quark).
The b quark is heavy compared to the light u and d quarks.
The interaction of a light quark with a heavy one is like that of an ant
with an elephant. The ant has absolutely no clue whether it is an African
elephant or an Asian elephant (or maybe a rhinoceros); it only knows it is
something heavy. In the same way the light quark knows neither the `flavour'
(what kind of) heavy quark it is interacting with, nor its spin.
In an effective theory (HQET) the mass of the heavy quark is set to infinity,
which makes calculations a lot easier. And due to the flavour symmetry,
predictions can easily be made by extrapolating the results from one
heavy quark to another. Corrections arise, since a real-life
heavy quark has a finite mass.
An excited B meson, denoted B**, decays to a B(*) and a pion.
Thanks to the good electromagnetic calorimeter of the
L3 experiment it was possible
to measure the neutral pion channel besides the (more easily measured)
charged pion channel. The results from the charged channel were consistent
with those obtained by other experiments. The mass obtained in
the neutral channel, however, was too low. It is not clear why that is the
case.
The results are available in the form of a nicely printed Ph.D.
thesis. If you are interested, let me know and I'll send you one (free
of charge). Or view my thesis here.
Back in Nijmegen I was also a teaching assistant:
nuclear physics for physics students and electromagnetism for chemistry students.
Defence
I defended my thesis on 29 September 1999.
Do you have questions or comments?
Return to my homepage
Updated 3 August 2011, Mirna van Hoek