Nuclear Inst. and Methods in Physics Research, A 908 (2018) 206–214
Modeling of a HPGe well detector using PENELOPE for the calculation of
full energy peak efficiencies for environmental samples
J.G. Guerra a,b,∗, J.G. Rubiano a,b, G. Winter c, A.G. Guerra a,b, H. Alonso a,b, M.A. Arnedo a,b, A. Tejera a,b, P. Martel a,b, J.P. Bolivar d
a Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria, Spain
b Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria, Spain
c Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria, Spain
d Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva, Spain
ARTICLE INFO
Keywords:
HPGe well detector
Efficiency calibration
Reference materials
Monte Carlo simulation
Gamma-ray spectrometry
ABSTRACT
When determining the activity concentration of radionuclides using gamma-ray spectrometry, the Full Energy
Peak Efficiency (FEPE) for the energies of interest must be known. The FEPE can be determined by
means of either experimental calibration or theoretical–mathematical methods, such as Monte Carlo simulations.
Given the difficulties related to experimental calibration and the improved capabilities of modern
computers, Monte Carlo simulation is an increasingly widely used alternative, but it requires an accurate model
of the detector. The purpose of this work is to generate and validate a computational model, based on Monte
Carlo simulation, of an HPGe well detector that permits FEPE calculations with a precision and accuracy
appropriate for the measurement of environmental samples. To achieve this, an optimization
methodology is applied to the model that minimizes the differences between a set of computational FEPEs and a
benchmark set of experimental FEPEs. The resulting optimized model is used to calculate computational
FEPEs for 25 samples with different reference materials and sample heights, which are measured by
means of the well detector modeled here. To validate the optimized model, these computational
FEPEs are used to calibrate the corresponding spectra, and the results of the analyses are then compared
with the expected values. The measured activities differ from the reference values by less
than 10% in most cases and are compatible with them within the uncertainties involved, thus confirming
the validity of the model.
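The optimization idea described above can be illustrated with a toy sketch (this is not the paper's actual code, and the response function, parameter, and numbers are invented for illustration): a free parameter of the detector model, here a hypothetical dead-layer thickness, is adjusted so that simulated FEPEs match the experimental benchmark, with a stand-in function playing the role of a real Monte Carlo (e.g. PENELOPE) run.

```python
import math

def simulate_fepe(energy_kev, dead_layer_mm):
    # Hypothetical detector response: efficiency falls with energy and
    # with the thickness of an inactive (dead) germanium layer.
    return 0.9 * math.exp(-0.002 * energy_kev) * math.exp(-0.5 * dead_layer_mm)

# Illustrative benchmark efficiencies (energy in keV -> measured FEPE).
measured = {122.0: 0.500, 662.0: 0.170, 1332.0: 0.045}

def cost(dead_layer_mm):
    # Sum of squared relative deviations between model and benchmark FEPEs,
    # i.e. the quantity the optimization methodology minimizes.
    return sum(((simulate_fepe(e, eff_d) - eff) / eff) ** 2
               for e, eff in measured.items()
               for eff_d in [dead_layer_mm])

# Coarse grid search over the free parameter (0 to 2 mm in 0.01 mm steps).
best = min((0.01 * i for i in range(201)), key=cost)
```

In the paper each cost evaluation would require a full Monte Carlo transport run, so in practice the search strategy matters far more than in this sketch.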
1. Introduction
Gamma-ray spectrometry is a technique commonly used for the
identification and quantification of a wide range of radionuclides by
means of their direct gamma emissions or those of their progeny. HPGe
detectors are currently the most widely used gamma-ray detectors due
to their various advantages, the most important of which is high energy
resolution, as it permits high-precision identification of the peaks.
For measuring the radioactive content of environmental
samples of low volume and activity in the laboratory, HPGe well detectors
are the best option due to their high detection efficiency, as the counting
geometry approaches 4π [1]; this improves both the
uncertainty of the results and the Minimum Detectable Activity. More
specifically, the use of this type of detector is recommended when
samples of only a few grams of organic matter are to be analyzed for
multidisciplinary studies, including radiological techniques [2,3]. These
aspects motivated us to acquire and set up a spectrometry system of this
type.
∗ Corresponding author at: Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria, Spain.
E-mail address: jglezg2002@gmail.com (J.G. Guerra).
When the activity concentration of radionuclides in a sample is to
be quantified by gamma-ray spectrometry, the Full Energy
Peak Efficiency (FEPE) for the energies of interest must be determined.
The FEPE can be determined either by means of experimental cal-
ibration, or by theoretical–mathematical methods such as Monte Carlo
simulations.
simulations. The experimental efficiency calibration [4–8] requires the
preparation of standard sources using reference materials with known
activities and with the same measuring geometry as the sample to be
measured. Materials with very similar densities and chemical compo-
sitions need to be used during the experimental calibration if self-
attenuation corrections are to be avoided, especially at low energies [1].
Moreover, when environmental samples are studied, the preparation of
standard sources that are very similar may be a process of considerable
complexity, involving appreciable economic and time costs.
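For context, the experimental calibration described above reduces, for each energy, to the standard relation ε(E) = N_net / (t_live · A · P_γ): the FEPE is the net peak count rate divided by the photon emission rate of the standard source. A minimal sketch (the numbers and the 137Cs line are illustrative, not taken from the paper):

```python
def fepe(net_counts, live_time_s, activity_bq, emission_prob):
    """FEPE = net full-energy-peak counts / photons emitted at that energy."""
    return net_counts / (live_time_s * activity_bq * emission_prob)

# Illustrative standard-source measurement: 137Cs line at 661.7 keV,
# emission probability P_gamma = 0.851.
eff = fepe(net_counts=125_000, live_time_s=3600.0,
           activity_bq=50.0, emission_prob=0.851)
print(round(eff, 4))  # prints 0.816
```

Repeating this for several lines of the standard source yields the set of experimental FEPEs used as the benchmark for the Monte Carlo model.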
https://doi.org/10.1016/j.nima.2018.08.048
Received 18 April 2018; Received in revised form 16 August 2018; Accepted 17 August 2018
Available online xxxx
0168-9002/© 2018 Elsevier B.V. All rights reserved.