• B. Balle and G. Rabusseau. Approximate minimization of weighted tree automata. Information and Computation, 282:104654, 2022. [ bib | pdf ]
• T. Li, D. Precup, and G. Rabusseau. Connecting weighted automata, tensor networks and recurrent neural networks through spectral learning. Machine Learning, 2022. [ bib | arXiv ]
• B. Mazoure, T. Doan, T. Li, V. Makarenkov, J. Pineau, D. Precup, and G. Rabusseau. Low-rank representation of reinforcement learning policies. Journal of Artificial Intelligence Research, 2022. [ bib ]
• V. Makarenkov, B. Mazoure, G. Rabusseau, and P. Legendre. Horizontal gene transfer and recombination analysis of SARS-CoV-2 genes helps discover its close relatives and shed light on its origin. BMC Ecology and Evolution, 21(1):1--18, 2021. [ bib ]
• R. Bailly, G. Rabusseau, and F. Denis. Recognizable series on graphs and hypergraphs. Journal of Computer and System Sciences, 104:58--81, 2019. [ bib | preprint | pdf ]
• V. François-Lavet, G. Rabusseau, J. Pineau, D. Ernst, and R. Fonteneau. On overfitting and asymptotic bias in batch reinforcement learning with partial observability. Journal of Artificial Intelligence Research, 65:1--30, 2019. [ bib | arXiv ]
• S. Huang, J. Danovitch, G. Rabusseau, and R. Rabbany. Fast and attributed change detection on dynamic graphs with density of states. In PAKDD, 2023. [ bib | arXiv ]
• S. Huang, F. Poursafaei, J. Danovitch, M. Fey, W. Hu, E. Rossi, J. Leskovec, M. Bronstein, G. Rabusseau, and R. Rabbany. Temporal graph benchmark for machine learning on temporal graphs. In NeurIPS, 2023. [ bib | arXiv ]
• C. Hua, G. Rabusseau, and J. Tang. High-order pooling for graph neural networks with tensor decomposition. In NeurIPS, 2022. [ bib | arXiv ]
• J. Miller, G. Rabusseau, and J. Terilla. Tensor networks for probabilistic sequence modeling. In AISTATS, 2021. [ bib | arXiv ]
• T. Doan, M. Bennani, B. Mazoure, G. Rabusseau, and P. Alquier. A theoretical analysis of catastrophic forgetting through the NTK overlap matrix. In AISTATS, 2021. [ bib | arXiv ]
• S. Srinivasan, S. Adhikary, J. Miller, G. Rabusseau, and B. Boots. Quantum tensor networks, stochastic processes, and weighted automata. In AISTATS, 2021. [ bib | arXiv ]
• B. Balle, C. Lacroce, P. Panangaden, D. Precup, and G. Rabusseau. Optimal spectral-norm approximate minimization of weighted finite automata. In ICALP, 2021. [ bib | arXiv ]
• S. Huang, V. François-Lavet, and G. Rabusseau. Understanding capacity saturation in incremental learning. In 34th Canadian Conference on Artificial Intelligence, 2021. [ bib ]
• C. Lacroce, P. Panangaden, and G. Rabusseau. Extracting weighted automata for approximate minimization in language modelling. In ICGI, 2021. [ bib | arXiv ]
• B. Khavari and G. Rabusseau. Lower and upper bounds on the pseudo-dimension of tensor network models. In NeurIPS, 2021. [ bib | arXiv ]
• T. Li, B. Mazoure, D. Precup, and G. Rabusseau. Efficient planning under partial observability with unnormalized Q functions and spectral learning. In AISTATS, 2020. [ bib | arXiv ]
• B. Rakhshan and G. Rabusseau. Tensorized random projections. In AISTATS, 2020. [ bib | code | arXiv ]
• S. Huang, Y. Hitti, G. Rabusseau, and R. Rabbany. Laplacian change point detection for dynamic graphs. In KDD, 2020. [ bib | arXiv ]
• G. Rabusseau, T. Li, and D. Precup. Connecting weighted automata and recurrent neural networks through spectral learning. In AISTATS, 2019. [ bib | arXiv | pdf | poster | slides ]
• G. Rabusseau. Minimization of graph weighted models over circular strings. In FoSSaCS, 2018. [ bib | pdf | slides ]
• T. Li, G. Rabusseau, and D. Precup. Nonlinear weighted finite automata. In AISTATS, 2018. [ bib | pdf ]
• P. Amortila and G. Rabusseau. Learning graph weighted models on pictures. In ICGI, 2018. [ bib | arXiv | pdf ]
• M. Ruffini, G. Rabusseau, and B. Balle. Hierarchical methods of moments. In NeurIPS, 2017. [ bib | pdf ]
• G. Rabusseau, B. Balle, and J. Pineau. Multitask spectral learning of weighted automata. In NeurIPS, 2017. [ bib | pdf | poster | slides ]
• G. Rabusseau, B. Balle, and S. B. Cohen. Low-rank approximation of weighted tree automata. In AISTATS, 2016. [ bib | arXiv | pdf | poster ]
• G. Rabusseau and H. Kadri. Low-rank regression with tensor responses. In NeurIPS, 2016. [ bib | code | pdf | poster ]
• R. Bailly, F. Denis, and G. Rabusseau. Recognizable series on hypergraphs. In LATA, 2015. [ bib | arXiv | slides ]
• G. Rabusseau and F. Denis. Maximizing a tree series in the representation space. In ICGI, 2014. [ bib | pdf | slides ]
• F. Heidari, P. Taslakian, and G. Rabusseau. Explaining graph neural networks using interpretable local surrogates. In Topological, Algebraic and Geometric Learning Workshops 2023, pages 146--155. PMLR, 2023. [ bib ]
• M. Gamal and G. Rabusseau. ROSA: Random orthogonal subspace adaptation. In Workshop on Efficient Systems for Foundation Models @ ICML 2023, 2023. [ bib ]
• K. Hou and G. Rabusseau. Spectral regularization: an inductive bias for sequence modeling. In LearnAut workshop at ICALP 2022, 2022. [ bib ]
• M. Lizaire, S. Verret, and G. Rabusseau. Spectral initialization of recurrent neural networks: Proof of concept. In LearnAut workshop at ICALP 2022, 2022. [ bib ]
• T. Li, B. Mazoure, and G. Rabusseau. Sequential density estimation via nonlinear continuous weighted finite automata. In LearnAut workshop at ICALP 2022, 2022. [ bib ]
• C. Lacroce, P. Panangaden, and G. Rabusseau. Towards an AAK theory approach to approximate minimization in the multi-letter case. In LearnAut workshop at ICALP 2022, 2022. [ bib ]
• A. Huang, K.-C. Wang, G. Rabusseau, and A. Makhzani. Few-shot image generation via implicit autoencoding of support sets. In Fifth Workshop on Meta-Learning at the Conference on Neural Information Processing Systems, 2021. [ bib ]
• B. T. Rakhshan and G. Rabusseau. Rademacher random projections with tensor networks. In Second Workshop on Quantum Tensor Networks in Machine Learning, in conjunction with NeurIPS, 2021. [ bib ]
• S. Srinivasan, S. Adhikary, J. Miller, B. Pokharel, G. Rabusseau, and B. Boots. Towards a trace-preserving tensor network representation of quantum channels. In Second Workshop on Quantum Tensor Networks in Machine Learning, in conjunction with NeurIPS, 2021. [ bib ]
• P. Amortila and G. Rabusseau. Learning graph weighted models on pictures. In 2nd workshop on Learning and Automata (LearnAut at FLoC 2018), 2018. [ bib ]
• S. Huang, V. François-Lavet, G. Rabusseau, and J. Pineau. Exploring continual learning using incremental architecture search. In NeurIPS 2018 Workshop on Continual Learning, 2018. [ bib | pdf ]
• D. Wu, G. Rabusseau, V. François-Lavet, D. Precup, and B. Boulet. Optimizing home energy management and electric vehicle charging with reinforcement learning. In Adaptive Learning Agents (ALA) workshop at the Federated AI Meeting, 2018. [ bib | pdf ]
• T. Li, G. Rabusseau, and D. Precup. Neural network based nonlinear weighted finite automata. In LICS workshop on Learning and Automata, 2017. [ bib ]
• G. Rabusseau and J. Pineau. Multitask spectral learning of weighted automata. In LICS workshop on Learning and Automata, 2017. [ bib ]
• R. Bailly and G. Rabusseau. Graph learning as a tensor factorization problem. In NIPS workshop on Learning with Tensors, 2016. [ bib | pdf ]
• G. Rabusseau and F. Denis. Learning negative mixture models by tensor decompositions. In Workshop on Method of Moments and Spectral Learning (ICML 2014), 2014. [ bib | poster ]