Metadata-Version: 2.1
Name: tinygp
Version: 0.1.0
Summary: The tiniest of Gaussian Process libraries
Home-page: https://github.com/dfm/tinygp
Author: Dan Foreman-Mackey
Author-email: foreman.mackey@gmail.com
Maintainer: Dan Foreman-Mackey
Maintainer-email: foreman.mackey@gmail.com
License: MIT
Platform: UNKNOWN
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Requires-Python: >=3.6
Description-Content-Type: text/markdown
Provides-Extra: test
Provides-Extra: docs
Provides-Extra: coverage
License-File: LICENSE

<p align="center">
  <img src="https://raw.githubusercontent.com/dfm/tinygp/main/docs/_static/zap.png" width="50"><br>
  <strong>tinygp</strong><br>
  <i>the tiniest of Gaussian Process libraries</i>
  <br>
  <br>
  <a href="https://github.com/dfm/tinygp/actions/workflows/tests.yml">
    <img alt="GitHub Workflow Status" src="https://img.shields.io/github/workflow/status/dfm/tinygp/Tests">
  </a>
  <a href="https://tinygp.readthedocs.io">
    <img alt="Read the Docs" src="https://img.shields.io/readthedocs/tinygp">
  </a>
</p>

_tinygp_ is an extremely lightweight library for building Gaussian Process
models in Python, built on top of [_jax_](https://github.com/google/jax). It is
not (yet?) designed to provide all the shiniest algorithms for scalable
computations, but I think it has a nice interface, and it's pretty fast. Thanks
to _jax_, _tinygp_ supports features such as GPU acceleration and automatic
differentiation.

Check out the docs for more info:
[tinygp.readthedocs.io](https://tinygp.readthedocs.io)
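To give a flavor of what a Gaussian Process library computes under the hood, here is a minimal numpy sketch of the GP log marginal likelihood with a squared-exponential kernel. Note that this is *not* tinygp's API (see the docs linked above for that); the function names `rbf_kernel` and `gp_log_likelihood` are illustrative only:

```python
import numpy as np

def rbf_kernel(x1, x2, scale=1.0):
    # Squared-exponential (RBF) kernel matrix between two 1-D input arrays
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / scale) ** 2)

def gp_log_likelihood(x, y, scale=1.0, noise=0.1):
    # Log marginal likelihood of y under a zero-mean GP:
    #   y ~ N(0, K + noise^2 * I)
    # computed stably via a Cholesky factorization of the kernel matrix.
    K = rbf_kernel(x, x, scale) + noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (
        -0.5 * y @ alpha
        - np.sum(np.log(np.diag(L)))
        - 0.5 * len(x) * np.log(2 * np.pi)
    )

x = np.linspace(0.0, 10.0, 25)
y = np.sin(x)
print(gp_log_likelihood(x, y, scale=1.5, noise=0.1))
```

In tinygp the same quantity is expressed through its kernel and `GaussianProcess` objects and, thanks to _jax_, can be JIT-compiled and differentiated with respect to the kernel parameters.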
