SBOAtools is an R package for the Secretary Bird Optimization Algorithm (SBOA). The package supports both general-purpose continuous optimization and single-hidden-layer multilayer perceptron (MLP) training.
It is designed for researchers working in metaheuristic optimization, computational intelligence, and neural network training. The package allows users to apply SBOA either as a standalone optimizer or as a training algorithm for feed-forward neural networks.
The main components are sboa(), sboa_mlp(), 23 built-in benchmark functions (F1-F23), the helpers list_benchmarks() and get_benchmark(), and S3 methods predict(), plot(), and print().

During development, the package can be installed from the local source using:
devtools::install()
Then load the package with:
library(SBOAtools)
You can also install the development version from GitHub:
install.packages("remotes")
remotes::install_github("burakdilber/SBOAtools")
sboa(): Performs general-purpose continuous optimization using the Secretary Bird Optimization Algorithm.
sboa_mlp(): Trains a single-hidden-layer multilayer perceptron using the Secretary Bird Optimization Algorithm.
list_benchmarks(): Displays the built-in benchmark functions available in the package.
get_benchmark(): Returns a benchmark definition and its metadata.
SBOAtools includes 23 built-in benchmark functions (F1-F23) for continuous optimization studies.
You can inspect the available benchmark functions with:
list_benchmarks()
You can retrieve a benchmark definition and its metadata with:
b <- get_benchmark("F9")
b$label
b$category
b$fn(rep(1, 5))
The built-in benchmark set includes unimodal, multimodal, and fixed-dimension functions.
library(SBOAtools)
sphere <- function(x) sum(x^2)
res <- sboa(
fn = sphere,
lower = rep(-10, 5),
upper = rep(10, 5),
n_agents = 10,
max_iter = 20,
seed = 123,
verbose = FALSE
)
print(res)
plot(res)
res$value
res$par
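Beyond plot(res), the raw convergence curve is available directly as res$convergence (the best objective value recorded at each iteration), so it can be post-processed by hand. A minimal sketch, using a simulated curve in place of res$convergence so the snippet runs on its own:

```r
# Simulated stand-in for res$convergence: best objective value per iteration
convergence <- 100 * exp(-0.3 * seq_len(20))

# Log-scale convergence plot, as one might draw from res$convergence
plot(seq_along(convergence), convergence, type = "l", log = "y",
     xlab = "Iteration", ylab = "Best objective value")

# For an elitist optimizer the recorded best value never increases
all(diff(convergence) <= 0)  # TRUE
```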
library(SBOAtools)
list_benchmarks()
res2 <- sboa(
fn = "F1",
lower = rep(-100, 30),
upper = rep(100, 30),
n_agents = 30,
max_iter = 500,
seed = 123,
verbose = FALSE
)
print(res2)
plot(res2)
You can also inspect a specific benchmark before optimization:
b <- get_benchmark("F14")
b$label
b$fixed_dim
library(SBOAtools)
set.seed(123)
X_train <- matrix(runif(40), nrow = 10, ncol = 4)
y_train <- matrix(runif(10), nrow = 10, ncol = 1)
fit_mlp <- sboa_mlp(
X_train = X_train,
y_train = y_train,
hidden_dim = 3,
n_agents = 10,
max_iter = 20,
lower = -1,
upper = 1,
seed = 123,
verbose = FALSE
)
print(fit_mlp)
plot(fit_mlp)
pred <- predict(fit_mlp, X_train)
pred
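A quick way to check the fit is to compare the predictions against the training targets with a root-mean-square error. A minimal sketch; rmse() below is a plain helper defined here, not part of SBOAtools:

```r
# Root-mean-square error between predictions and observed targets
rmse <- function(pred, obs) sqrt(mean((pred - obs)^2))

# For the fitted model above one would call:
# rmse(predict(fit_mlp, X_train), y_train)

# Self-contained check on toy values:
rmse(c(1, 2, 3), c(1, 2, 4))  # sqrt(1/3), about 0.577
```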
sboa()

The sboa() function returns an object of class "sboa" containing:

par: best solution found
value: best objective function value
convergence: convergence curve over iterations
population: final population matrix
fitness: final fitness values of the population
call: matched function call

sboa_mlp()

The sboa_mlp() function returns an object of class "sboa_mlp" containing:

par: optimized neural network parameters
value: best objective function value
convergence: convergence curve over iterations
input_dim: number of input variables
hidden_dim: number of hidden neurons
output_dim: number of output variables
x_min: minimum values used for input normalization
x_max: maximum values used for input normalization
y_min: minimum values used for output normalization
y_max: maximum values used for output normalization
fitted: fitted values on the original scale
metrics: training performance metrics
call: matched function call

The current version of the package supports:
Possible future improvements include:
References

sboa(): Fu, W., Wang, K., Liu, J., et al. (2024). Secretary Bird Optimization Algorithm. Artificial Intelligence Review. https://doi.org/10.1007/s10462-024-10729-y
sboa_mlp(): Dilber, B., & Ozdemir, A. F. (2026). A novel approach to training feed-forward multi-layer perceptrons with recently proposed secretary bird optimization algorithm. Neural Computing and Applications. https://doi.org/10.1007/s00521-026-11874-x
MIT License