optimizer#
Optimizer: various on-chip optimization algorithms
Classes

| Optimizer | A base class for optimizers. |
| OptimizerBayesian | Optimizer based on Bayesian optimization. |
| OptimizerFourier | Optimizer based on Fourier series. |
| OptimizerSPSA | Optimizer based on SPSA (Simultaneous Perturbation Stochastic Approximation). |
- class Optimizer(target_func, param_init, random_state=0)[source]#
  Bases: object

  A base class for optimizers.
  - Parameters:
    - target_func – The target function to optimize (more specifically, to minimize). It is supposed to accept **kwargs in the format of param_init as inputs.
    - param_init – The initial parameters for the target function. Its keys should be consistent with the inputs of target_func.
    - random_state – The random seed for this optimization process.
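The contract between target_func and param_init can be sketched as follows. The cost function and parameter names here are illustrative toys, not part of the library; the only point is that the keys of param_init must match the keyword arguments of target_func.

```python
import math

# Hypothetical target function: accepts keyword arguments whose names
# match the keys of param_init, and returns the cost to be minimized.
def target_func(theta1, theta2):
    # toy cost, minimized at theta1 = theta2 = pi
    return math.cos(theta1 / 2) ** 2 + math.cos(theta2 / 2) ** 2

# The keys of param_init are consistent with target_func's inputs.
param_init = {"theta1": 0.5, "theta2": 1.0}

# An optimizer can then evaluate the target as target_func(**params).
initial_cost = target_func(**param_init)
```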
- class OptimizerBayesian(target_func, param_init, random_state=0)[source]#
  Bases: Optimizer

  Optimizer based on Bayesian optimization.
  See bayesian-optimization/BayesianOptimization.
  - Parameters:
    - target_func – The target function to optimize (more specifically, to minimize). It is supposed to accept **kwargs in the format of param_init as inputs.
    - param_init – The initial parameters for the target function. Its keys should be consistent with the inputs of target_func.
    - random_state – The random seed for this optimization process.

  Note
  In the scenario of on-chip optimization, the periods of the phase shifters all span 0 to \(2\pi\), so in this program pbound (a parameter determining the search region in the Bayesian-Optimization package) is fixed from 0 to \(2\pi\).
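The effect of the fixed search region can be illustrated with a simple bounded random search; this stand-in is not Bayesian optimization itself, and the function and parameter names are hypothetical, but every candidate is drawn from the same fixed \([0, 2\pi]\) box per phase-shifter parameter that the pbound above fixes.

```python
import math
import random

# Illustrative stand-in (not the Bayesian routine itself): a random search
# restricted to the fixed search region 0..2*pi for every parameter.
def bounded_random_search(target_func, param_init, n_iter=200, random_state=0):
    rng = random.Random(random_state)
    best_params = dict(param_init)
    best_cost = target_func(**best_params)
    for _ in range(n_iter):
        # every phase-shifter parameter is sampled inside [0, 2*pi]
        candidate = {k: rng.uniform(0.0, 2 * math.pi) for k in param_init}
        cost = target_func(**candidate)
        if cost < best_cost:
            best_params, best_cost = candidate, cost
    return best_params, best_cost

cost = lambda phi: math.cos(phi / 2) ** 2   # toy cost, minimized at phi = pi
params, best = bounded_random_search(cost, {"phi": 0.1})
```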
- class OptimizerFourier(target_func, param_init, order=5, lr=0.1, random_state=0)[source]#
  Bases: Optimizer

  Optimizer based on Fourier series.
  Obtain the gradient approximation from the target function approximation.
  - Parameters:
    - target_func – The target function to optimize (more specifically, to minimize). It is supposed to accept **kwargs in the format of param_init as inputs.
    - param_init – The initial parameters for the target function. Its keys should be consistent with the inputs of target_func.
    - order – The order of the Fourier series used for the approximation.
    - lr – The learning rate of the learning process (namely, the gradient descent process).
    - random_state – The random seed for this optimization process.
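The class internals are not shown here, but the general idea (fit a truncated Fourier series to a \(2\pi\)-periodic cost, differentiate the fit analytically, then run gradient descent with learning rate lr) can be sketched for a single parameter as follows; the sampling scheme and function names are assumptions for illustration.

```python
import math

def fourier_coeffs(f, order):
    """Discrete Fourier coefficients of a 2*pi-periodic function f,
    sampled at enough points to resolve the requested order."""
    n = 4 * order + 1
    xs = [2 * math.pi * j / n for j in range(n)]
    ys = [f(x) for x in xs]
    a = [2 / n * sum(y * math.cos(k * x) for x, y in zip(xs, ys)) for k in range(order + 1)]
    b = [2 / n * sum(y * math.sin(k * x) for x, y in zip(xs, ys)) for k in range(order + 1)]
    return a, b

def grad_from_fit(a, b, theta):
    # derivative of sum_k [a_k cos(k*theta) + b_k sin(k*theta)]
    return sum(k * (-a[k] * math.sin(k * theta) + b[k] * math.cos(k * theta))
               for k in range(1, len(a)))

a, b = fourier_coeffs(math.cos, order=5)   # toy cost: cos(theta), minimum at pi
theta, lr = 0.5, 0.1
for _ in range(200):
    theta -= lr * grad_from_fit(a, b, theta)   # gradient descent on the fit
# theta converges toward pi, the minimizer of cos
```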
- class OptimizerSPSA(target_func, param_init, random_state=0)[source]#
  Bases: Optimizer

  Optimizer based on SPSA (Simultaneous Perturbation Stochastic Approximation).
  See https://www.jhuapl.edu/spsa/Pages/MATLAB.htm.
  - Parameters:
    - target_func – The target function to optimize (more specifically, to minimize). It is supposed to accept **kwargs in the format of param_init as inputs.
    - param_init – The initial parameters for the target function. Its keys should be consistent with the inputs of target_func.
    - random_state – The random seed for this optimization process.
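SPSA estimates the full gradient from just two function evaluations per step, by perturbing all parameters simultaneously along a random ±1 direction; this makes it well suited to on-chip settings where each evaluation is a physical measurement. A minimal textbook sketch (gain constants and the toy cost are assumptions, not the class's actual defaults):

```python
import random

def spsa_minimize(f, theta, n_iter=1000, a=0.2, c=0.1, random_state=0):
    rng = random.Random(random_state)
    theta = list(theta)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602                       # standard SPSA gain decay
        ck = c / k ** 0.101                       # perturbation-size decay
        delta = [rng.choice((-1.0, 1.0)) for _ in theta]   # Rademacher direction
        plus = [t + ck * d for t, d in zip(theta, delta)]
        minus = [t - ck * d for t, d in zip(theta, delta)]
        # two evaluations give an estimate of the whole gradient:
        diff = (f(plus) - f(minus)) / (2 * ck)
        theta = [t - ak * diff / d for t, d in zip(theta, delta)]
    return theta

cost = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2   # toy cost, minimum at (1, -2)
theta = spsa_minimize(cost, [0.0, 0.0])
```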