Difference between pytorch lightning and lightning · Lightning

2024-07-17 17:39 | Source: web aggregation | Views: 265

Hi @sudarshan85 👋 let's see if I can help clarify this. Also, thank you (and @surak) for being persistent in this thread.

Note

We are going to address import conventions in issue 18327.

I started using the Lightning ecosystem last year before joining the team a couple of months ago. I've experienced the changes as a community member, and now as a developer advocate. Before I get into the weeds, here is a high-level explanation.

Lightning Flash is for deep learning beginners. Lightning Bolts is for practitioners who need a foundational model to finetune. PyTorch Lightning is for practitioners who are implementing a model from scratch with PyTorch and want a robust trainer. PyTorch Lightning can still be installed as a standalone dependency with `pip install pytorch-lightning`.

Diving In

Note

this is my framing and not an official position

Lightning is a mono package that contains 1) PyTorch Lightning, 2) Lightning Fabric, and 3) Lightning Apps. Running `pip install lightning` installs all three, plus TorchMetrics. I'll circle back to this after briefly covering Flash, Bolts, and TorchMetrics.

I think of Flash and Bolts as the scikit-learn of the Lightning AI ecosystem, in that each is high-level and contains model implementations. Without Flash or Bolts, we would create our own models from scratch with PyTorch and then train with PyTorch Lightning or Lightning Fabric (Lightning Fabric is a lower-level trainer solution that gives more control back to experienced practitioners).

On Flash. Flash is high-level and is for getting started with deep learning. It offers common deep learning tasks you can use out of the box in a few lines of code, and it is built on top of PyTorch Lightning.

On Bolts. Bolts is also high-level and built on top of PyTorch Lightning. We would use Bolts for its built-in models, to finetune them on our own data, and to extend them for additional research.

With respect to Flash and Bolts, this means the leveling looks something like:

```mermaid
flowchart LR;
    A[PyTorch] --> B[PyTorch Lightning] --> C[Bolts];
    B --> D[Flash]
```

TorchMetrics is a collection of 100+ metrics across several domains, including audio, image, text, classification, and regression. It can be thought of as the equivalent of `sklearn.metrics`, except that TorchMetrics implementations can run on CPU or GPU, and in distributed training environments.

On Lightning and PyTorch Lightning

Last year the team rolled out Lightning Apps, and with that came a decision to unify PyTorch Lightning and Lightning Apps into a single repo and framework: Lightning. Around that time, Lightning Fabric (a lower-level trainer) was also created and placed into the Lightning repo. This means PyTorch Lightning, Lightning Fabric, and Lightning Apps all exist in Lightning.

As of Lightning 2.0, we can still install PyTorch Lightning as a standalone framework for backwards compatibility.

With respect to PyTorch Lightning and Lightning Fabric, the levels are:

```mermaid
flowchart LR;
    A[PyTorch] --> B[Lightning Fabric] --> C[PyTorch Lightning];
```

And the flow for deciding between the PyTorch Lightning Trainer and a custom Lightning Fabric trainer is something like:

```mermaid
flowchart LR;
    A[PyTorch] --> B{{Prefers using a pre-built Trainer}} --> D[PyTorch Lightning];
    A --> C{{Prefers creating a custom Trainer}} --> E[Lightning Fabric]
    D --> F[your trained or tuned model checkpoint]
    E --> F
```

Where it is implied that either a PyTorch Lightning Trainer or a custom trainer built with Lightning Fabric gets out-of-the-box distributed training by setting common arguments such as devices, num_nodes, and strategy, and can train on either CPU or GPU by setting the accelerator argument.


