<!DOCTYPE html>
<html>
<head>
<link rel="stylesheet" href="style.css">
<title>LassoNet: Neural Networks with Feature Sparsity</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/[email protected]/dist/katex.min.css">
<script defer src="https://cdn.jsdelivr.net/npm/[email protected]/dist/katex.min.js">
</script>
<script defer src="https://cdn.jsdelivr.net/npm/[email protected]/dist/contrib/auto-render.min.js">
</script>
<script>
document.addEventListener("DOMContentLoaded", function () {
renderMathInElement(document.body, {
// customised options
// • auto-render specific keys, e.g.:
delimiters: [{
left: '$$',
right: '$$',
display: true
},
{
left: '$',
right: '$',
display: false
},
{
left: '\\(',
right: '\\)',
display: false
},
{
left: '\\[',
right: '\\]',
display: true
}
],
// • rendering keys, e.g.:
throwOnError: false
});
});
</script>
<meta property="og:title" content="LassoNet: Neural Networks with Feature Sparsity " />
<meta http-equiv="Content-Security-Policy"
content="default-src *; style-src 'self' 'unsafe-inline' https://cdn.jsdelivr.net/; script-src 'self' 'unsafe-inline' 'unsafe-eval' https://cdn.jsdelivr.net/">
</head>
<body>
<br>
<h1 class=center>LassoNet: Neural Networks with Feature Sparsity</h1>
<table id=authors class=center>
<tr>
<td>
<a href="https://ismael.lemhadri.org/">Ismael Lemhadri</a>
</td>
<td>
<a href="https://fengruan.github.io/">Feng Ruan</a>
</td>
<td>
<a href="https://louisabraham.github.io/">Louis Abraham</a>
</td>
<td>
<a href="https://statweb.stanford.edu/~tibs/">Rob Tibshirani</a>
</td>
</tr>
</table>
<table id=links class=center>
<tr>
<td>
<a href='https://jmlr.org/papers/volume22/20-848/20-848.pdf'>[Paper]</a>
</td>
<td>
<a href='https://github.com/ilemhadri/lassonet'>[Code]</a>
</td>
<td>
<a href='./lassonet/api'>[Docs]</a>
</td>
<td>
<a href='https://ismael.lemhadri.org/papers/pdf/lassonet_poster.pdf'>[Poster]</a>
</td>
<td>
<a href='https://ismael.lemhadri.org/papers/pdf/lassonet_slides.pdf'>[Slides]</a>
</td>
</tr>
</table>
<img class="center" src="lassonet/fig1.png" height="300px"></img>
<p>
<b>LassoNet</b> is a method for feature selection in neural networks that enhances the interpretability of the
final network.
</p>
<ul>
<li>It uses a novel objective function and learning algorithm that encourage the network to use only a subset of
the available input features; that is, the resulting network is <b>"feature sparse"</b>.</li>
<li>This is achieved not by post-hoc analysis of a standard neural network but by <b>building feature sparsity into
the objective function itself</b> (a sketch of the formulation appears after this list):
<ul>
<li>Input-to-output (skip-layer) connections are added to the network, with an L1 penalty on their weights</li>
<li>The skip-layer weight for each feature acts as an upper bound on all hidden-layer weights involving
that feature</li>
</ul>
</li>
<li>The result is an <b>entire path of network solutions</b> with varying amounts of feature sparsity, analogous
to the lasso solution path for linear regression.</li>
</ul>
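<p>
In symbols, write $\theta \in \mathbb{R}^d$ for the skip-layer weights, $W$ for the hidden-layer weights (with
$W^{(1)}_j$ the first-layer weights attached to feature $j$), $L$ for the training loss, $\lambda$ for the
penalty level, and $M$ for the hierarchy multiplier. The training problem then takes roughly the form below;
see the paper for the precise statement:
$$\min_{\theta,\, W} \; L(\theta, W) + \lambda \|\theta\|_1 \quad \text{subject to} \quad \|W^{(1)}_j\|_\infty \le M\,|\theta_j|, \quad j = 1, \dots, d.$$
Whenever $\theta_j = 0$, the constraint forces every first-layer weight on feature $j$ to zero, so feature $j$ is
dropped from the network entirely; sweeping $\lambda$ upward traces out the path of increasingly sparse solutions.
</p>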
<hr>
<h1>LassoNet in 2 minutes</h1>
<iframe src="https://www.youtube.com/embed/bbqpUfxA_OA" frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen></iframe>
<hr>
<h1>Installation</h1>
<pre style="text-align: center;">pip install lassonet</pre>
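<p>
A minimal usage sketch is shown below. The class, method, and attribute names follow the scikit-learn-style
interface used in the package's examples; treat them as assumptions and check the [Docs] link above for the
exact API.
</p>
<pre>
# Minimal usage sketch; names below mirror the package's sklearn-style
# examples and should be verified against the documentation.
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler

from lassonet import LassoNetClassifier

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)  # standardize inputs (see Tips below)

model = LassoNetClassifier(hidden_dims=(32,))
# path() first trains a dense network (lambda = 0), then sweeps the
# lambda path, returning one checkpoint per lambda value.
path = model.path(X, y)

for checkpoint in path:
    # `selected` is assumed to be a boolean mask over the input features.
    print(checkpoint.lambda_, int(checkpoint.selected.sum()))
</pre>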
<hr>
<h1>Examples</h1>
<ul>
<li><a href="https://github.com/lasso-net/lassonet/blob/master/examples/miceprotein.py">Mice Protein Expression
dataset
(classification)</a></li>
<li><a href="https://github.com/lasso-net/lassonet/blob/master/examples/boston_housing.py">Boston Housing dataset
(regression)</a></li>
<li><a href="https://github.com/lasso-net/lassonet/blob/master/examples/diabetes.py">Diabetes dataset (regression)
</a></li>
</ul>
<hr>
<div>
<h1>Tips</h1>
LassoNet sometimes requires fine-tuning. For optimal performance, consider the following (a short code sketch follows the list):
<ul>
<li>standardizing the inputs</li>
<li>making sure that the initial dense model (with $\lambda = 0$) has trained well before starting the LassoNet
regularization path. This may involve hyper-parameter tuning, choosing the right optimizer, and so on. If the
dense model underperforms, the sparser models likely will as well.</li>
<li>making sure the step size over the $\lambda$ path is not too large. By default, $\lambda$ increases in
geometric increments until no features are left.
</li>
</ul>
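<p>
The sketch below applies this checklist on a toy dataset. Everything beyond the general workflow is an
assumption about the package's interface (in particular the commented-out path_multiplier option and the
checkpoint attributes), so verify the names against the documentation.
</p>
<pre>
# Sketch of the tuning checklist above. Options other than hidden_dims
# are assumptions about the package's interface; check the docs before
# relying on them.
from sklearn.datasets import load_diabetes
from sklearn.preprocessing import StandardScaler

from lassonet import LassoNetRegressor

X, y = load_diabetes(return_X_y=True)
X = StandardScaler().fit_transform(X)      # tip 1: standardize the inputs

model = LassoNetRegressor(
    hidden_dims=(64,),
    # Assumed option: a multiplier closer to 1 gives a finer lambda grid
    # (tip 3) at the cost of a longer path.
    # path_multiplier=1.01,
)
path = model.path(X, y)

# Tip 2: inspect the dense end of the path (smallest lambda, all features
# selected) before trusting the sparser models further along the path.
dense = path[0]
print("features selected by the dense model:", int(dense.selected.sum()))
</pre>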
</div>
<hr>
<div>
<h1>Intro video</h1>
<iframe src="https://www.youtube.com/embed/G5vPojso9PU" frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen></iframe>
<h1>Talk</h1>
<iframe src="https://www.youtube.com/embed/ztGcoMPazwc" frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen></iframe>
<br>
</div>
<hr>
<div>
<h1>Citation</h1>
The algorithms and methods used in this package came primarily out of research in Rob Tibshirani's lab at Stanford
University. If you use LassoNet in your research, we would appreciate a citation to the paper:
<pre>
@article{JMLR:v22:20-848,
author = {Ismael Lemhadri and Feng Ruan and Louis Abraham and Robert Tibshirani},
title = {LassoNet: A Neural Network with Feature Sparsity},
journal = {Journal of Machine Learning Research},
year = {2021},
volume = {22},
number = {127},
pages = {1-29},
url = {http://jmlr.org/papers/v22/20-848.html}
}
</pre>
</div>
</body>
</html>