Center for Nonlinear Studies
Jason K. Johnson

Director's Postdoctoral Fellow

Information Theory, Statistical Physics and Graphical Models


Office: TA-3, Bldg 1690, Room 138
Mail Stop: B258
Phone: (505) 665-7816
Fax: (505) 665-2659

Research Highlights:
  • Majorization-minimization approach to design of power transmission networks. Mini-workshop on optimization and control theory for smart grids, Los Alamos NM, August 2010. slides
  • Orbit-product analysis of generalized Gaussian belief propagation. Physics of Algorithms Workshop, Santa Fe NM, September 2009. slides
 Educational Background/Employment:
  • Ph.D., Electrical Engineering and Computer Science (2008), MIT.
  • S.M. Electrical Engineering and Computer Science (2003), MIT.
  • S.B. Physics (1995), MIT.
  • Employment:
    • 2008-Present Director's Postdoctoral Fellow, LANL.
    • 1995-2000 Member of Technical Staff, Alphatech, Inc.

Research Interests:

  • Inference and learning in graphical models.
  • Information theory and statistical physics.
  • Statistical signal and image processing.
  • Design and control of electric power networks.
  • Multiscale methods.
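
Several of these interests (and publications 5–7, 14, and 15 below) concern Gaussian belief propagation. As an illustration only, not taken from any of the papers listed here, the following is a minimal sketch of scalar Gaussian belief propagation in the information form p(x) ∝ exp(−½ xᵀA x + bᵀx), whose marginal means solve Aμ = b. The function name `gabp_means` and the small chain example are hypothetical; convergence is assumed to hold for walk-summable models (e.g. diagonally dominant A), and on a tree the iteration recovers the exact means.

```python
def gabp_means(A, b, iters=200):
    """Scalar Gaussian belief propagation (illustrative sketch).

    A: symmetric precision matrix (list of lists), b: potential vector.
    Returns approximate marginal means; exact on trees, and convergent
    for walk-summable models such as diagonally dominant A.
    """
    n = len(b)
    # Neighbors of each node: nonzero off-diagonal couplings in A.
    nbrs = [[j for j in range(n) if j != i and A[i][j] != 0.0] for i in range(n)]
    # Messages from i to j: a precision part (alpha) and a potential part (beta).
    alpha = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}
    beta = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}
    for _ in range(iters):
        new_alpha, new_beta = {}, {}
        for i in range(n):
            for j in nbrs[i]:
                # Cavity precision/potential at i, excluding j's incoming message.
                p = A[i][i] + sum(alpha[(k, i)] for k in nbrs[i] if k != j)
                h = b[i] + sum(beta[(k, i)] for k in nbrs[i] if k != j)
                # Integrate out x_i through the pairwise coupling A[i][j].
                new_alpha[(i, j)] = -A[i][j] * A[i][j] / p
                new_beta[(i, j)] = -A[i][j] * h / p
        alpha, beta = new_alpha, new_beta
    # Marginal mean: (local potential + messages) / (local precision + messages).
    means = []
    for i in range(n):
        p = A[i][i] + sum(alpha[(k, i)] for k in nbrs[i])
        h = b[i] + sum(beta[(k, i)] for k in nbrs[i])
        means.append(h / p)
    return means
```

On a 3-node chain with A = [[4,1,0],[1,4,1],[0,1,4]] and b = [1,2,3], the chain is a tree, so the returned means match the exact solution of Aμ = b (μ = [5/28, 2/7, 19/28]).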

Selected Recent Publications:

  1. S. Kudekar, JKJ, M. Chertkov (MC). Linear Programming Decoding using Frustrated Cycles. To appear at Information Theory Workshop (ITW), 2011.
  2. JKJ, P. Netrapalli, MC. Learning Planar Ising Models. arxiv
  3. S. Kudekar, JKJ, MC. Linear programming based detectors for two-dimensional intersymbol interference channels. To appear at IEEE International Symposium on Information Theory, July 2011. arxiv
  4. JKJ, MC. A Majorization-Minimization Approach to Design of Power Transmission Networks, In Proc. of 49th IEEE Conference on Decision and Control, December 2010. arxiv
  5. JKJ, V. Chernyak, MC. Orbit-Product Representation and Correction of Gaussian Belief Propagation. In Proc. of Inter. Conf. on Machine Learning, June 2009. arxiv
  6. JKJ, D. Bickson, D. Dolev. Fixing Convergence of Gaussian Belief Propagation. In Proc. of Inter. Symposium on Information Theory, July 2009. arxiv
  7. JKJ. Convex Relaxation Methods for Graphical Models: Lagrangian and Maximum Entropy Approaches. Ph.D. thesis, MIT, August 2008. mit
  8. JKJ, A. Willsky (AW). A Recursive Model-Reduction Method for Estimation in Gaussian Markov Random Fields. IEEE Trans. on Image Processing, January 2008. ieee
  9. D. Malioutov (DM), JKJ, M. Choi, AW. Low-rank variance approximation in GMRF Models: single and multiscale approaches. IEEE Transactions on Signal Processing (TSP), October 2008. ieee
  10. V. Chandrasekaran (VC), JKJ, AW. Estimation in Gaussian graphical models using tractable sub-graphs: a walk-sum analysis. TSP, May 2008. ieee
  11. VC, JKJ, AW. Adaptive Embedded Subgraph Algorithms using Walk-Sum Analysis. In Proc. of Advances in Neural Information Processing Systems (NIPS), 2007. nips
  12. JKJ, DM, AW. Lagrangian relaxation method for MAP estimation in graphical models. In Proc. of Allerton Conf. on Communication, Control and Computing, September 2007. arxiv
  13. JKJ, VC, AW. Learning Markov Structure by Maximum Entropy Relaxation. In AISTATS, March 2007. jmlr
  14. DM, JKJ, AW. Walk-sums and belief propagation in Gaussian graphical models. Journal of Machine Learning Research, October 2006. jmlr
  15. JKJ, DM, AW. Walk-Sum Interpretation and Analysis of Gaussian Belief Propagation. NIPS, December 2005. nips