Jason K. Johnson
Director-Funded Postdoctoral Fellow, CNLS/T-4
Information Theory, Statistical Physics and Graphical Models
Office: TA-3, Bldg 1690, Room 138 | Mail Stop: B258
Phone: (505) 665-7816 | Fax: (505) 665-2659
Email: jasonj@lanl.gov | home page

Research highlights:
- Majorization-minimization approach to design of power transmission networks. Mini-workshop on optimization and control theory for smart grids, Los Alamos NM, August 2010. slides
- Orbit-product analysis of generalized Gaussian belief propagation. Physics of algorithms workshop, Santa Fe NM, September 2009. slides
Educational Background/Employment:
- Ph.D. Electrical Engineering and Computer Science (2008), MIT.
- S.M. Electrical Engineering and Computer Science (2003), MIT.
- S.B. Physics (1995), MIT.

Employment:
- 2008-Present: Director's Postdoctoral Fellow, LANL.
- 1995-2000: Member of Technical Staff, Alphatech, Inc.
Research Interests:
- Inference and learning in graphical models.
- Information theory and statistical physics.
- Statistical signal and image processing.
- Design and control of electric power networks.
- Multiscale methods.
Selected Recent Publications:
- S. Kudekar, JKJ, M. Chertkov (MC).
Linear Programming Decoding using Frustrated Cycles.
To appear at Information Theory Workshop (ITW), 2011.
- JKJ, P. Netrapalli, MC. Learning Planar Ising
Models. arxiv
- S. Kudekar, JKJ, MC. Linear programming based
detectors for two-dimensional intersymbol interference channels. To appear
at IEEE International Symposium on Information Theory, July 2011.
arxiv
- JKJ, MC. A Majorization-Minimization
Approach to Design of Power Transmission Networks, In Proc. of 49th IEEE
Conference on Decision and Control, December 2010.
arxiv
- JKJ, V. Chernyak, MC.
Orbit-Product Representation and Correction of Gaussian Belief Propagation.
In Proc. of Inter. Conf. on Machine Learning, June 2009.
arxiv
- JKJ, D. Bickson, D. Dolev.
Fixing Convergence of Gaussian Belief Propagation.
In Proc. of Inter. Symposium on Information Theory, July 2009.
arxiv
- JKJ, Convex Relaxation Methods for
Graphical Models: Lagrangian and Maximum Entropy Approaches.
Ph.D. Thesis, MIT, August 2008.
mit
- JKJ, A. Willsky (AW). A Recursive
Model-Reduction Method for Estimation in Gaussian Markov Random
Fields. IEEE Trans. on Image Processing, January 2008.
ieee
- D. Malioutov (DM), JKJ, M. Choi, AW. Low-rank variance approximation
in GMRF Models: single and multiscale approaches. IEEE Transactions on
Signal Processing (TSP), October 2008.
ieee
- V. Chandrasekaran (VC), JKJ, AW. Estimation in Gaussian
graphical models using tractable sub-graphs: a walk-sum analysis.
TSP, May 2008.
ieee
- VC, JKJ, AW. Adaptive Embedded Subgraph
Algorithms using Walk-Sum Analysis. In Proc. of Advances in Neural Information Processing Systems (NIPS), 2007.
nips
- JKJ, DM, AW.
Lagrangian relaxation method for MAP estimation in graphical
models. In Proc. of Allerton Conf. on Communication, Control
and Computing, September 2007.
arxiv
- JKJ, VC, AW.
Learning Markov Structure by Maximum Entropy Relaxation. In
AISTATS, March 2007.
jmlr
- DM, JKJ, AW. Walk-sums and belief
propagation in Gaussian graphical models. Journal of Machine
Learning Research, October 2006.
jmlr
- JKJ, DM, AW.
Walk-Sum Interpretation and Analysis of Gaussian Belief
Propagation. NIPS, December 2005.
nips