Edilson Fernandes de Arruda

ORCID: 0000-0002-9835-352X

According to our database, Edilson Fernandes de Arruda authored at least 29 papers between 2004 and 2023.

Collaborative distances:
  • Dijkstra number of five.
  • Erdős number of four.

Bibliography

2023
Modelling lung cancer diagnostic pathways using discrete event simulation.
J. Simulation, January, 2023

2022
Epidemic Control Modeling using Parsimonious Models and Markov Decision Processes.
CoRR, 2022

2021
A Novel Stochastic Epidemic Model with Application to COVID-19.
CoRR, 2021

Modelling and Optimal Control of Multi Strain Epidemics, with Application to COVID-19.
CoRR, 2021

2020
Learning-agent-based simulation for queue network systems.
J. Oper. Res. Soc., 2020

Dimensionality reduction for multi-criteria problems: An application to the decommissioning of oil and gas installations.
Expert Syst. Appl., 2020

Flattening the curves: on-off lock-down strategies for COVID-19 with an application to Brazil.
CoRR, 2020

Optimisation and control of the supply of blood bags in hemotherapic centres via Markov decision process with discounted arrival rate.
Artif. Intell. Medicine, 2020

Mine-to-client planning with Markov Decision Process.
Proceedings of the 18th European Control Conference, 2020

2019
A multi-cluster time aggregation approach for Markov chains.
Autom., 2019

Optimal testing policies for diagnosing patients with intermediary probability of disease.
Artif. Intell. Medicine, 2019

2018
Long-term integrated surgery room optimization and recovery ward planning, with a case study in the Brazilian National Institute of Traumatology and Orthopedics (INTO).
Eur. J. Oper. Res., 2018

Oil industry value chain simulation with learning agents.
Comput. Chem. Eng., 2018

2017
Multi-partition time aggregation for Markov Chains.
Proceedings of the 56th IEEE Annual Conference on Decision and Control, 2017

2016
Discounted Markov decision processes via time aggregation.
Proceedings of the 15th European Control Conference, 2016

2015
Solving average cost Markov decision processes by means of a two-phase time aggregation algorithm.
Eur. J. Oper. Res., 2015

2013
Accelerating the convergence of value iteration by using partial transition functions.
Eur. J. Oper. Res., 2013

2012
Optimal Approximation Schedules for a Class of Iterative Algorithms, With an Application to Multigrid Value Iteration.
IEEE Trans. Autom. Control., 2012

Window Walker - a Markov-Based Adaptive Bit Window Selection for DSP Blocks.
J. Circuits Syst. Comput., 2012

A two-phase time aggregation algorithm for average cost Markov decision processes.
Proceedings of the American Control Conference, 2012

2011
Time aggregated Markov decision processes via standard dynamic programming.
Oper. Res. Lett., 2011

Approximate dynamic programming via direct search in the space of value function approximations.
Eur. J. Oper. Res., 2011

2010
Toward an optimized value iteration algorithm for average cost Markov decision processes.
Proceedings of the 49th IEEE Conference on Decision and Control, 2010

2009
Standard dynamic programming applied to time aggregated Markov decision processes.
Proceedings of the 48th IEEE Conference on Decision and Control, 2009

2008
Stability and optimality of a multi-product production and storage system under demand uncertainty.
Eur. J. Oper. Res., 2008

An application of convex optimization concepts to approximate dynamic programming.
Proceedings of the American Control Conference, 2008

2007
Optimal approximation schedules for iterative algorithms with application to dynamic programming.
Proceedings of the 46th IEEE Conference on Decision and Control, 2007

2006
Approximate Dynamic Programming Based on Expansive Projections.
Proceedings of the 45th IEEE Conference on Decision and Control, 2006

2004
Stability and optimality of a discrete production and storage model with uncertain demand.
Proceedings of the 43rd IEEE Conference on Decision and Control, 2004

