Alessandro Achille PhD student

University of California, Los Angeles
achille at cs.ucla.edu
alexachi

I am a PhD candidate in the Computer Science Department at UCLA, working with Prof. Stefano Soatto in the Vision Lab. During my PhD I have also been a research scientist intern at DeepMind and Amazon AI. My research interests include representation learning, information theory, variational inference, deep learning, and their applications to computer vision.

Before coming to UCLA, I obtained a Master's degree in Pure Mathematics from the Scuola Normale Superiore and the University of Pisa, where I studied model theory, algebraic topology, and their intersection with Prof. Alessandro Berarducci. During that period, I was also a visiting student in the Mathematics Department at the University of Leeds.

Teaching

CS103 at Caltech: Topics in Representation Learning, Information Theory and Control

Publications

  • A. Achille, M. Rovere, S. Soatto
    Critical Learning Periods in Deep Networks
    International Conference on Learning Representations (ICLR) 2019
    @inproceedings{achille2018critical,
    title={Critical Learning Periods in Deep Networks},
    author={Alessandro Achille and Matteo Rovere and Stefano Soatto},
    booktitle={International Conference on Learning Representations},
    year={2019},
    url={https://openreview.net/forum?id=BkeStsCcKQ},
    }
    
  • A. Achille, T. Eccles, L. Matthey, C.P. Burgess, N. Watters, A. Lerchner, I. Higgins
    Life-Long Disentangled Representation Learning with Cross-Domain Latent Homologies
    Neural Information Processing Systems 31 (NeurIPS), 2018
    @incollection{NIPS2018_8193,
    title = {Life-Long Disentangled Representation Learning with Cross-Domain Latent Homologies},
    author = {Achille, Alessandro and Eccles, Tom and Matthey, Loic and Burgess, Chris and Watters, Nicholas and Lerchner, Alexander and Higgins, Irina},
    booktitle = {Advances in Neural Information Processing Systems 31},
    editor = {S. Bengio and H. Wallach and H. Larochelle and K. Grauman and N. Cesa-Bianchi and R. Garnett},
    pages = {9895--9905},
    year = {2018},
    publisher = {Curran Associates, Inc.},
    url = {http://papers.nips.cc/paper/8193-life-long-disentangled-representation-learning-with-cross-domain-latent-homologies.pdf}
    }
    
  • A. Achille, S. Soatto
    A Separation Principle for Control in the Age of Deep Learning
    Annual Review of Control, Robotics, and Autonomous Systems, 2018
    @article{achille2017separation,
        author = {Alessandro Achille and Stefano Soatto},
        title = {A Separation Principle for Control in the Age of Deep Learning},
        journal = {Annual Review of Control, Robotics, and Autonomous Systems},
        volume = {1},
        number = {1},
        year = {2018},
        doi = {10.1146/annurev-control-060117-105140},
        url = {https://doi.org/10.1146/annurev-control-060117-105140}
    }
    
  • A. Achille, S. Soatto
    Emergence of Invariance and Disentanglement in Deep Representations
    Journal of Machine Learning Research (JMLR), 2018
    @article{JMLR:v19:17-646,
      author  = {Alessandro Achille and Stefano Soatto},
      title   = {Emergence of Invariance and Disentanglement in Deep Representations},
      journal = {Journal of Machine Learning Research},
      year    = {2018},
      volume  = {19},
      number  = {50},
      pages   = {1-34},
      url     = {http://jmlr.org/papers/v19/17-646.html}
    }
    
  • A. Achille, S. Soatto
    Information Dropout: learning optimal representations through noisy computation
    IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2018
    @article{achille2018information,
        author={A. Achille and S. Soatto},
        journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
        title={Information Dropout: Learning Optimal Representations Through Noisy Computation},
        year={2018},
        volume={PP},
        number={99},
        pages={1-1},
        keywords={Bayes methods;Information theory;Machine learning;Neural networks;Noise measurement;Training;Representation learning;deep learning;information bottleneck;invariants;minimality;nuisances},
        doi={10.1109/TPAMI.2017.2784440},
        issn={0162-8828}
    }
    
  • A. Achille, A. Berarducci
    A Vietoris-Smale mapping theorem for the homotopy of hyperdefinable sets
    Selecta Mathematica, 2018
    @article{achille2018a,
      author = {Achille, Alessandro and Berarducci, Alessandro},
      year = {2018},
      title = {A Vietoris-Smale mapping theorem for the homotopy of hyperdefinable sets},
      journal = {Selecta Mathematica},
      issn = {1022-1824},
      doi = {10.1007/s00029-018-0413-3},
      month = {4},
      pages = {1--29},
      url = {https://doi.org/10.1007/s00029-018-0413-3},
      abstract = {Results of Smale (1957) and Dugundji (1969) allow one to compare the homotopy groups of two topological spaces X and Y whenever a map f:X→Y with strong connectivity conditions on the fibers is given. We can apply similar techniques to compare the homotopy of spaces living in different categories, for instance an abelian variety over an algebraically closed field, and a real torus. More generally, working in o-minimal expansions of fields, we compare the o-minimal homotopy of a definable set X with the homotopy of some of its bounded hyperdefinable quotients X/E. Under suitable assumptions, we show that pi_n^def(X)=pi_n(X/E) and dim(X)=dim_R(X/E). As a special case, given a definably compact group, we obtain a new proof of Pillay's group conjecture dim(G)=dim_R(G/G00), largely independent of the group structure of G. We also obtain different proofs of various comparison results between classical and o-minimal homotopy.}
    }
    

Earlier work

  • A. Achille
    On definable groups in o-minimal and NIP settings
    MSc thesis, University of Pisa, 2015