
# SBOAtools

SBOAtools is an R package for the Secretary Bird Optimization Algorithm (SBOA). The package supports both general-purpose continuous optimization and single-hidden-layer multilayer perceptron (MLP) training.

It is designed for researchers working in metaheuristic optimization, computational intelligence, and neural network training. The package allows users to apply SBOA either as a standalone optimizer or as a training algorithm for feed-forward neural networks.

## Features

- General-purpose continuous optimization with the Secretary Bird Optimization Algorithm (SBOA)
- Training of single-hidden-layer multilayer perceptrons (MLPs) with SBOA
- 23 built-in benchmark functions (F1-F23) covering unimodal, multimodal, and fixed-dimension problems
- `print()`, `plot()`, and `predict()` methods for the returned objects

## Installation

During development, the package can be installed from the local source using:

```r
devtools::install()
```

Then load the package with:

```r
library(SBOAtools)
```

You can also install the development version from GitHub:

```r
install.packages("remotes")
remotes::install_github("burakdilber/SBOAtools")
```

## Main Functions

### `sboa()`

Performs general-purpose continuous optimization using the Secretary Bird Optimization Algorithm.

### `sboa_mlp()`

Trains a single-hidden-layer multilayer perceptron using the Secretary Bird Optimization Algorithm.

### `list_benchmarks()`

Displays the built-in benchmark functions available in the package.

### `get_benchmark()`

Returns a benchmark definition and its metadata.

## Built-in Benchmark Functions

SBOAtools includes 23 built-in benchmark functions (F1-F23) for continuous optimization studies.

You can inspect the available benchmark functions with:

```r
list_benchmarks()
```

You can retrieve a benchmark definition and its metadata with:

```r
b <- get_benchmark("F9")
b$label
b$category
b$fn(rep(1, 5))
```

The built-in benchmark set includes unimodal, multimodal, and fixed-dimension functions.
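As a sketch of how this metadata can be used, the following loop collects the label and category of a few benchmarks into a small summary table. It assumes only that `get_benchmark()` returns a list with `label` and `category` fields for every benchmark ID, as shown above for `"F9"`; the IDs chosen here are arbitrary.

```r
library(SBOAtools)

# Summarise the metadata of a few selected benchmarks.
ids <- c("F1", "F9", "F14")
info <- do.call(rbind, lapply(ids, function(id) {
  b <- get_benchmark(id)
  data.frame(id = id, label = b$label, category = b$category)
}))
info
```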

## Example 1: General Optimization with a User-Defined Function

```r
library(SBOAtools)

sphere <- function(x) sum(x^2)

res <- sboa(
  fn = sphere,
  lower = rep(-10, 5),
  upper = rep(10, 5),
  n_agents = 10,
  max_iter = 20,
  seed = 123,
  verbose = FALSE
)

print(res)
plot(res)

res$value
res$par
```
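Because SBOA is stochastic, results vary with the seed. A minimal multi-start sketch, using only the `seed` argument and the `value`/`par` components shown above, runs the optimizer from several seeds and keeps the best run:

```r
library(SBOAtools)

sphere <- function(x) sum(x^2)

# Run SBOA from five different seeds and keep the run
# with the lowest objective value.
runs <- lapply(1:5, function(s) {
  sboa(
    fn = sphere,
    lower = rep(-10, 5),
    upper = rep(10, 5),
    n_agents = 10,
    max_iter = 20,
    seed = s,
    verbose = FALSE
  )
})
best <- runs[[which.min(sapply(runs, function(r) r$value))]]
best$value
best$par
```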

## Example 2: General Optimization with a Built-in Benchmark

```r
library(SBOAtools)

list_benchmarks()

res2 <- sboa(
  fn = "F1",
  lower = rep(-100, 30),
  upper = rep(100, 30),
  n_agents = 30,
  max_iter = 500,
  seed = 123,
  verbose = FALSE
)

print(res2)
plot(res2)
```

You can also inspect a specific benchmark before optimization:

```r
b <- get_benchmark("F14")
b$label
b$fixed_dim
```

## Example 3: MLP Training with SBOA

```r
library(SBOAtools)

set.seed(123)

X_train <- matrix(runif(40), nrow = 10, ncol = 4)
y_train <- matrix(runif(10), nrow = 10, ncol = 1)

fit_mlp <- sboa_mlp(
  X_train = X_train,
  y_train = y_train,
  hidden_dim = 3,
  n_agents = 10,
  max_iter = 20,
  lower = -1,
  upper = 1,
  seed = 123,
  verbose = FALSE
)

print(fit_mlp)
plot(fit_mlp)

pred <- predict(fit_mlp, X_train)
pred
```
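As a quick sanity check on the fitted network, the training-set predictions can be compared with the targets, for example via the root mean squared error. This assumes `predict()` returns a numeric vector or matrix with one prediction per row of `y_train`, as in the example above:

```r
# Root mean squared error between predictions and targets.
# Reuses pred and y_train from the example above.
rmse <- sqrt(mean((as.numeric(pred) - as.numeric(y_train))^2))
rmse
```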

## Returned Objects

### Output of `sboa()`

The `sboa()` function returns an object of class `"sboa"` containing, among other components, the best parameter vector (`par`) and the best objective value (`value`); the object supports the `print()` and `plot()` methods shown in the examples.

### Output of `sboa_mlp()`

The `sboa_mlp()` function returns an object of class `"sboa_mlp"` that stores the trained network and supports the `print()`, `plot()`, and `predict()` methods shown in Example 3.

## Current Scope

The current version of the package supports:

- General-purpose continuous optimization via `sboa()`
- Single-hidden-layer MLP training via `sboa_mlp()`
- 23 built-in benchmark functions (F1-F23), accessible through `list_benchmarks()` and `get_benchmark()`

## Future Extensions

Possible future improvements include:

## Authors

## References

## License

MIT License




SBOAtools documentation built on May 3, 2026, 9:06 a.m.