hypervolume | R Documentation
Compute the hypervolume metric with respect to a given reference point assuming minimization of all objectives.
hypervolume(x, reference, maximise = FALSE)
x | Either a matrix or a data frame of numerical values, where each row gives the coordinates of one point and each column corresponds to one objective.
reference | Reference point given as a numerical vector with one value per objective.
maximise | Whether the objectives must be maximised instead of minimised. Either a single logical value that applies to all objectives or a logical vector with one value per objective.
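As a minimal illustration of these arguments (the point matrix below is made up for this help page; hypervolume() is assumed to be available from the attached package):

# Three mutually non-dominated points, both objectives minimised.
pts <- matrix(c(1, 5,
                2, 3,
                4, 1), ncol = 2, byrow = TRUE)
# The reference point must be strictly worse than every point in every objective.
hypervolume(pts, reference = c(5, 6))  # 12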
The hypervolume of a set of multidimensional points A \subset \mathbb{R}^d with respect to a reference point \vec{r} \in \mathbb{R}^d is the volume of the region dominated by the set and bounded by the reference point \citep{ZitThi1998ppsn}. Points in A that do not strictly dominate \vec{r} do not contribute to the hypervolume value; thus, ideally, the reference point must be strictly dominated by all points in the true Pareto front.
More precisely, the hypervolume is the Lebesgue measure of the union of axis-aligned hyperrectangles (orthotopes), where each hyperrectangle is defined by one point \vec{a} \in A and the reference point. The union of axis-aligned hyperrectangles is also called an orthogonal polytope.
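For intuition, in 2D this union is easy to compute by hand: after sorting mutually non-dominated points by the first objective, each point contributes one rectangular slice. The following plain R sketch (written for this help page, not part of the package, and assuming minimisation and mutually non-dominated input) computes the hypervolume directly from this definition:

# Direct 2D hypervolume from the definition (both objectives minimised).
hv2d <- function(pts, reference) {
  # Sort by the first objective; for mutually non-dominated points the
  # second objective is then decreasing.
  pts <- pts[order(pts[, 1]), , drop = FALSE]
  hv <- 0
  prev_f1 <- reference[1]
  # Sweep from the largest to the smallest first-objective value: each point
  # adds a slice of width (prev_f1 - f1) and height (reference[2] - f2).
  for (i in rev(seq_len(nrow(pts)))) {
    hv <- hv + (prev_f1 - pts[i, 1]) * (reference[2] - pts[i, 2])
    prev_f1 <- pts[i, 1]
  }
  hv
}
hv2d(matrix(c(1, 5, 2, 3, 4, 1), ncol = 2, byrow = TRUE), c(5, 6))  # 12

Applied to the three points used above with reference point (5, 6), it returns 12, matching hypervolume().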
The hypervolume is compatible with Pareto-optimality \citep{KnoCor2002cec,ZitThiLauFon2003:tec}, that is, \nexists A, B \subset \mathbb{R}^d such that A is better than B in terms of Pareto-optimality and \text{hyp}(A) \leq \text{hyp}(B). In other words, if a set is better than another in terms of Pareto-optimality, the hypervolume of the former must be strictly larger than the hypervolume of the latter. Conversely, if the hypervolume of a set is larger than the hypervolume of another, then we know for sure that the latter set cannot be better than the former in terms of Pareto-optimality.
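The following sketch illustrates this property on two made-up 2D sets, where each point of A dominates the corresponding point of B:

A <- matrix(c(1, 4,
              3, 2), ncol = 2, byrow = TRUE)
B <- matrix(c(2, 5,
              4, 3), ncol = 2, byrow = TRUE)
ref <- c(6, 6)
# A is better than B in terms of Pareto-optimality, so its hypervolume
# must be strictly larger.
hypervolume(A, reference = ref)  # 16
hypervolume(B, reference = ref)  # 8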
For 2D and 3D, the algorithms used \citep{FonPaqLop06:hypervolume,BeuFonLopPaqVah09:tec} have O(n \log n) complexity, where n is the number of input points. The 3D case uses the \text{HV3D}^{+} algorithm \citep{GueFon2017hv4d}, which has the same complexity as the HV3D algorithm \citep{FonPaqLop06:hypervolume,BeuFonLopPaqVah09:tec}, but is faster in practice.
For 4D, the algorithm used is \text{HV4D}^{+} \citep{GueFon2017hv4d}, which has O(n^2) complexity. Compared to the original implementation, this implementation correctly handles weakly dominated points and has been further optimized for speed.
For 5D or higher, it uses a recursive algorithm \citep{FonPaqLop06:hypervolume} with \text{HV4D}^{+} as the base case, resulting in O(n^{d-2}) time complexity and O(n) space complexity in the worst case, where d is the dimension of the points. Experimental results show that the pruning techniques used may reduce the time complexity even further. The original proposal \citep{FonPaqLop06:hypervolume} had the HV3D algorithm as the base case, giving a time complexity of O(n^{d-2} \log n). Andreia P. Guerreiro enhanced the numerical stability of the algorithm by avoiding floating-point comparisons of partial hypervolumes.
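At the user level, the dimension only changes the input; the algorithm selection described above happens internally. A sketch with random 5D data (invented purely for illustration):

set.seed(42)
# 50 random points in [0, 1]^5; all objectives minimised, so rep(1, 5) is
# strictly dominated by every point and is a valid reference.
x5 <- matrix(runif(50 * 5), ncol = 5)
hypervolume(x5, reference = rep(1, 5))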
numeric(1): A single numerical value.
Manuel López-Ibáñez
data(SPEA2minstoptimeRichmond)
# The second objective must be maximized
# We calculate the hypervolume of the union of all sets.
hypervolume(SPEA2minstoptimeRichmond[, 1:2], reference = c(250, 0),
            maximise = c(FALSE, TRUE))