Posted on 26-Jan-2017
Interpretable Bayesian Functional Linear Regression

Paul-Marie Grollemund
with Christophe Abraham, Meli Baragatti and Pierre Pudlo
University of Montpellier
Content
Introduction
Model
Application
Conclusion
Data

Sample of n observations:
- The real response Y: (Y_i, 1 ≤ i ≤ n), the average number of grains of maize per plant: 585.02, 375.64, ..., 358.49.
- The functional covariate X: (X_i(·), 1 ≤ i ≤ n), the temperature curve.

[Figure: average daily temperature curves over 80 days]
Functional regression

Explain the number of grains with the temperature.

Model:
Y_i = μ + ∫_T X_i(t) β(t) dt + ε_i,   for i = 1, ..., n,
where ε_i is a Gaussian noise.

Three methods:
- FDA (Ramsay and Silverman 2005)
- FLiRTI (James et al. 2009)
- Fused Lasso (Tibshirani et al. 2005)
[Figure: FDA estimate (Ramsay and Silverman 2005) vs. the target function]
Find relevant periods.

[Figure: FLiRTI estimate with 95% CI (James et al. 2009) vs. the target function]
[Figure: Fused Lasso estimate (Tibshirani et al. 2005) vs. the target function]
Aim

Construct a method that:
- is stable with respect to its tuning parameters,
- can include prior knowledge,
- provides an evaluation of its confidence,
- produces interpretable estimators.

[Figure: a non-interpretable vs. an interpretable coefficient function]
Sparsity

Y | X, μ, β, σ² ~ N_n(μ 1_n + ∫_T X(t) β(t) dt, σ² I_n)

The set of interpretable functions E:
β(t) = Σ_{k=1}^{K} β_k 1{t ∈ I_k}   (1)

[Figure: the step function β_0(t) and its support]
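The set E in (1) consists of step functions built from K coefficients and K intervals. As a minimal sketch (function and variable names are hypothetical, not from the slides), such a function can be evaluated on a grid like this:

```python
import numpy as np

def step_function(t, coefs, intervals):
    """Evaluate beta(t) = sum_k coefs[k] * 1{t in intervals[k]}.

    t         : array of evaluation points
    coefs     : length-K sequence of step heights beta_k
    intervals : list of K (lower, upper) bounds for the intervals I_k
    """
    t = np.asarray(t, dtype=float)
    beta = np.zeros_like(t)
    for b_k, (lo, hi) in zip(coefs, intervals):
        beta += b_k * ((t >= lo) & (t <= hi))  # intervals may overlap
    return beta

grid = np.linspace(0, 1, 5)
print(step_function(grid, [2.0, -1.0], [(0.0, 0.5), (0.4, 0.6)]))  # → [2. 2. 1. 0. 0.]
```

At t = 0.5 both intervals overlap, so the two heights add up, which is exactly how (1) behaves on overlapping I_k.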
[Figure: the relevant regions I_1 and I_2 on the support]
[Figure: the constrained step function β(t), taking values β*_1 on I_1 and β*_2 on I_2, compared with β_0]
Bayesian modeling

When β ∈ E, the integral can be rewritten as
∫_T X(t) β(t) dt = Σ_{k=1}^{K} β_k ∫_{I_k} X(t) dt = x^I β,
where x^I collects the integrals of X over the intervals I_k.

Parametric model:
Y | X, μ, β, σ², I ~ N_n(μ 1_n + x^I β, σ² I_n)
μ | σ² ~ N(0, v_0 σ²)
(β, σ²) ~ NIG_K(η, V, a, b)
I ~ π_I(·)

- The intervals I_k can overlap.
- K is taken large enough to detect all relevant regions.
- We can fix the hyperparameters so that the prior is weakly informative.
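Once β ∈ E, fitting reduces to an ordinary linear model whose design matrix has entries x^I[i, k] = ∫_{I_k} X_i(t) dt. A sketch of that projection for curves sampled on a grid, using the trapezoidal rule restricted to each interval (function and variable names are assumptions):

```python
import numpy as np

def interval_design(curves, grid, intervals):
    """Compute x^I with entries x[i, k] = integral of X_i(t) over I_k.

    curves    : (n, p) array, the X_i sampled on `grid`
    grid      : length-p increasing array of time points
    intervals : list of K (lower, upper) interval bounds
    """
    n = curves.shape[0]
    xI = np.zeros((n, len(intervals)))
    for k, (lo, hi) in enumerate(intervals):
        mask = (grid >= lo) & (grid <= hi)
        g, sub = grid[mask], curves[:, mask]
        # trapezoidal rule on the grid points falling inside I_k
        xI[:, k] = np.sum(0.5 * (sub[:, 1:] + sub[:, :-1]) * np.diff(g), axis=1)
    return xI

grid = np.linspace(0, 1, 101)
curves = np.ones((3, 101))   # constant curves: each integral is the interval length
print(interval_design(curves, grid, [(0.0, 0.5), (0.5, 1.0)]))  # each row ≈ [0.5, 0.5]
```

Because the intervals may overlap, columns of x^I can share grid points; the model handles that through the sum in (1).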
Inference

Bayes estimator:
For a loss function L, the Bayes estimator of f(θ) based on the data D is any
d minimizing ∫ L(d, f(θ)) π(θ | D) dθ.
Here β(t) = f(β, I; t) = Σ_{k=1}^{K} β_k 1{t ∈ I_k}.

L2-loss and posterior expected value:
For the L2-loss function, the Bayes estimator of β is
β̂(·) = ∫ β(·) π(β, I | Y) dβ dI.

Problem: β̂ ∉ E, the set of interpretable functions (non-convexity of E).
Inference

Bayesian modeling and constraints:
(estimator) ∈ argmin (loss) + (constraints)
posterior ∝ (likelihood) × (prior) × (constraints)
New loss function:
L(β, d) = ‖β − d‖²_2 · 1{d ∈ E} + ∞ · 1{d ∉ E},
where E denotes the set of interpretable functions.
Estimator of β:
Under the loss function L(d, β) = ‖d − β‖²_2 · 1{d ∈ E} + ∞ · 1{d ∉ E}, the Bayes estimator is
β̂ ∈ argmin_d ∫ L(d, β) π(β, I | Y) dβ dI.
Implementation

Posterior: the prior for I is non-conjugate, but the full conditional distributions are tractable → Gibbs sampler.

Estimator: Monte Carlo integration of L over the posterior,
(1/N) Σ_{i=1}^{N} L(d, β_i) ≈ ∫ L(d, β) π(β, I | Y) dβ dI,
then simulated annealing to optimize the criterion.
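The Monte Carlo approximation of the criterion is a plain average of losses over posterior draws. A sketch, assuming draws of β are stored as their values on a common equally spaced grid and using the squared L2 distance (all names hypothetical):

```python
import numpy as np

def criterion(d_values, draws, grid):
    """Monte Carlo approximation of C(d) = E[ ||beta - d||_2^2 | Y ].

    d_values : candidate function d evaluated on `grid`
    draws    : (N, p) array, posterior draws of beta evaluated on `grid`
    grid     : length-p equally spaced time points
    """
    dt = grid[1] - grid[0]
    # Riemann approximation of the squared L2 norm, one value per draw
    sq = np.sum((draws - d_values) ** 2, axis=1) * dt
    return sq.mean()

grid = np.linspace(0, 1, 101)
draws = np.zeros((100, 101))       # degenerate "posterior" concentrated at beta = 0
print(criterion(np.ones(101), draws, grid))   # ≈ 1.01
```

In the actual procedure, C(d) is then minimized over candidates d ∈ E, which is where the simulated annealing step comes in.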
Simulation study

Data simulation:
- For X(t): simulate 50 curves on a grid of 100 equally spaced points. [Figure: simulated curves X(t), n = 50, p = 100]
- For β: pick a target function β (to estimate). [Figure: the target function β]
- Simulate a Gaussian noise such that V[Y]/V[ε] ≈ 4, and build the Y_i from the model.
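The simulation recipe above can be sketched as follows. The smooth-curve construction and the particular step target are illustrative assumptions; only the sizes (n = 50, p = 100) and the variance ratio V[Y]/V[ε] ≈ 4 come from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 100
grid = np.linspace(0, 1, p)

# Smooth random curves: random combinations of a few sine terms plus a level shift
freqs = np.arange(1, 6)
coefs = rng.normal(size=(n, len(freqs)))
X = coefs @ np.sin(2 * np.pi * np.outer(freqs, grid)) + rng.normal(size=(n, 1))

# A step target function with two relevant regions (illustrative choice)
beta = 3.0 * ((grid >= 0.2) & (grid <= 0.35)) - 2.0 * ((grid >= 0.6) & (grid <= 0.7))

# Build Y from the model; noise variance chosen so that V[Y]/V[eps] ≈ 4,
# i.e. V[signal] = 3 V[eps]
signal = X @ beta / p                      # Riemann approximation of the integral
eps = rng.normal(scale=np.sqrt(signal.var() / 3), size=n)
Y = 1.0 + signal + eps                     # intercept mu = 1 (illustrative)
print(Y.shape, X.shape)
```

Any other smooth curve generator would do; the point is only that X lives on a common grid so the integral reduces to a weighted sum.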
Simulation study

[Figure: FLiRTI estimate with 95% CI and Fused Lasso estimate vs. the target function on the simulated data]
Simulation study

[Figure: estimate vs. the target function on the simulated data (two panels)]
Agronomic data

29 observations (different environmental conditions).

X: temperature curve over 80 days. [Figure: average daily temperature]
(Data: François Tardieu, Claude Welcker, Emilie Millet)

Y: average number of grains of maize per plant:

  Y1      Y2      Y3      ...   Y28    Y29
  585.02  375.64  176.09  ...   96.38  441.22
Agronomic data

[Figure: FLiRTI estimate with 95% CI and Fused Lasso estimate on the agronomic data]
Agronomic data

[Figure: estimates under the loss L and at the posterior mode, annotated with the growth, reproductive-system, and flowering periods]
Conclusion

- Produces estimates with a very simple shape.
- Provides a convenient representation of the posterior.

Perspectives:
- Include prior knowledge.
- Generalize to a model including an extra categorical variable.
- Theoretical results.
E is non-convex

[Figure: two functions f and g in E and their pointwise mean, mean(f, g), which falls outside E]
Bad behavior of the mode

[Figures: on two simulated examples, the estimate under the loss L vs. the posterior mode, each compared with the target function]
Generalization to other forms

[Figures: estimates with Gaussian, triangular, and Epanechnikov kernels vs. the target function]
Intervals parametrization

I_k = [m_k − ℓ_k, m_k + ℓ_k]

Hierarchical model:
Y | X, μ, β, σ², m, ℓ ~ N_n(μ 1_n + x^{mℓ} β, σ² I_n)
μ | σ² ~ N(0, v_0 σ²)
ℓ ~ U(]0, ℓ_max]^K)
(β, σ²) ~ NIG_K(η, V, a, b)
m ~ U(T^K)
Gibbs sampler: full conditional distributions

μ | Y, X, β, σ², m, ℓ ~ N( (μ_0 / v_0 + 1_n^T (Y − x^{mℓ} β)) / (n + 1/v_0), σ² / (n + 1/v_0) )

β | Y, X, μ, σ², m, ℓ ~ N( A^{−1} ((x^{mℓ})^T (Y − μ 1_n) + V^{−1} η), σ² A^{−1} ),
with A = (x^{mℓ})^T x^{mℓ} + V^{−1}

σ² | Y, X, μ, β, m, ℓ ~ IG( a + (n + K + 1)/2, b̃ )

π(m_k | Y, X, μ, β, σ², m_{−k}, ℓ) ∝ exp{ −‖Y − μ 1_n − x^{mℓ} β‖² / (2σ²) } 1{m_k ∈ T}

π(ℓ_k | Y, X, μ, β, σ², ℓ_{−k}, m) ∝ exp{ −‖Y − μ 1_n − x^{mℓ} β‖² / (2σ²) } 1{ℓ_k ∈ ]0, ℓ_max]}

where
b̃ = b + ½ ‖Y − μ 1_n − x^{mℓ} β‖² + (μ − μ_0)² / (2 v_0) + ½ (β − η)^T V^{−1} (β − η).
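With m and ℓ held fixed, the updates of μ, β and σ² above are standard conjugate draws. A sketch of one such sweep, writing X for the n × K design x^{mℓ} (parameter names and default hyperparameter values are assumptions):

```python
import numpy as np

def gibbs_sweep(Y, X, mu, beta, rng,
                mu0=0.0, v0=10.0, eta=None, Vinv=None, a=1.0, b=1.0):
    """One conjugate Gibbs sweep for (sigma^2, mu, beta), m and l held fixed."""
    n, K = X.shape
    eta = np.zeros(K) if eta is None else eta
    Vinv = np.eye(K) if Vinv is None else Vinv

    # sigma^2 | rest ~ IG(a + (n + K + 1)/2, b_tilde)
    resid = Y - mu - X @ beta
    b_tilde = (b + 0.5 * resid @ resid
                 + 0.5 * (mu - mu0) ** 2 / v0
                 + 0.5 * (beta - eta) @ Vinv @ (beta - eta))
    sigma2 = 1.0 / rng.gamma(a + (n + K + 1) / 2, 1.0 / b_tilde)

    # mu | rest ~ N((mu0/v0 + 1'(Y - X beta)) / (n + 1/v0), sigma^2/(n + 1/v0))
    prec = n + 1.0 / v0
    mu = rng.normal((mu0 / v0 + np.sum(Y - X @ beta)) / prec,
                    np.sqrt(sigma2 / prec))

    # beta | rest ~ N(A^{-1} c, sigma^2 A^{-1}), A = X'X + V^{-1}
    A = X.T @ X + Vinv
    c = X.T @ (Y - mu) + Vinv @ eta
    beta = rng.multivariate_normal(np.linalg.solve(A, c),
                                   sigma2 * np.linalg.inv(A))
    return mu, beta, sigma2

rng = np.random.default_rng(1)
Y = rng.normal(size=30); X = rng.normal(size=(30, 2))
mu, beta, s2 = gibbs_sweep(Y, X, 0.0, np.zeros(2), rng)
print(s2 > 0, beta.shape)
```

The non-conjugate updates for m_k and ℓ_k would be added around this sweep, e.g. by Metropolis-within-Gibbs steps targeting the conditionals above.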
Simulated annealing

Minimize the criterion C(·):
- Initialize d_0 and T_0.
- For i from 1 to N:
  - Propose a candidate d*.
  - Compute the acceptance probability α = min{ 1, exp( −(C(d*) − C(d_{i−1})) / T_{i−1} ) }.
  - Simulate u ~ U([0, 1]). If u < α, set d_i = d* (accept); else set d_i = d_{i−1} (reject).
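The annealing loop can be sketched for a generic criterion. All names are hypothetical, and the geometric cooling schedule is an assumption, since the slides leave the schedule unspecified:

```python
import numpy as np

def simulated_annealing(C, d0, propose, rng, n_iter=1000, T0=1.0, cool=0.995):
    """Minimize criterion C by simulated annealing.

    C       : criterion to minimize
    d0      : initial state
    propose : function (d, rng) -> candidate state
    """
    d, cost, T = d0, C(d0), T0
    best, best_cost = d, cost
    for _ in range(n_iter):
        cand = propose(d, rng)
        cand_cost = C(cand)
        # accept with probability min(1, exp(-(C(cand) - C(d)) / T))
        if rng.uniform() < np.exp(min(0.0, -(cand_cost - cost) / T)):
            d, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = d, cost
        T *= cool                # geometric cooling (assumed schedule)
    return best, best_cost

rng = np.random.default_rng(0)
C = lambda d: (d - 3.0) ** 2     # toy 1-D criterion with minimum at 3
best, best_cost = simulated_annealing(C, 10.0, lambda d, r: d + r.normal(scale=0.5), rng)
print(best, best_cost)
```

In the talk's setting, d would be a step function (its β_k, m_k, ℓ_k) and C the Monte Carlo estimate of the posterior expected loss.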