lexl separates Excel formulas into tokens of different types and gives each token's depth within a nested formula. Its name is a bad pun on 'Excel' and 'lexer'. Try the online demo or run `demo_lexl()` locally.
You can install lexl from GitHub with:

```r
# install.packages("devtools")
devtools::install_github("nacnudus/lexl")
```
```r
library(lexl)
x <- lex_xl("MIN(3,MAX(2,A1))")
x
plot(x) # requires the ggraph package
```
Not all parse trees are the same. The one given by `lex_xl()` is intended for analysis, rather than for computation. Examples of the kind of analysis that it might support are:
* The tidyxl package imports formulas from xlsx (spreadsheet) files.
* The Enron corpus contains thousands of real-life spreadsheets.
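As a sketch of that kind of analysis (assuming, as the package documents, that `lex_xl()` returns a data frame with `level`, `type` and `token` columns), you could tally tokens by type or measure how deeply a formula nests:

```r
library(lexl)

x <- lex_xl("MIN(3,MAX(2,A1))")

# Tally tokens by type, e.g. to count function calls and cell references
table(x$type)

# The deepest nesting level gives a crude measure of formula complexity
max(x$level)
```

Applied across a whole corpus of imported formulas, summaries like these are one way to spot unusually complex or risky spreadsheets.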
Research by Felienne Hermans inspired this package, and the related XLParser project was a great help in creating the grammar.