Giving Back to the Community
Our Open Source Initiatives
We believe in the power of open source to accelerate innovation and democratize access to cutting-edge technology. Our initiatives focus on empowering researchers, developers, and data scientists with tools that make advanced machine learning more accessible and efficient, particularly in the fields of protein analysis and chemical property prediction.
Community Impact
Total Stars · Total Forks · Contributors · Active Projects
FastPLMs
An open-source effort to increase the efficiency of pretrained protein language models by replacing their native attention with Flash/Flex attention.
Key Features
Flash/Flex Attention: Replaces native attention for improved efficiency
Huggingface Integration: All models can be loaded via AutoModel from Huggingface transformers
Dataset Embedding: Embed entire datasets with no new code using embed_dataset (see the sketch after this list)
Optimized Processing: Sequences are sorted by length to reduce padding tokens for faster processing
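Below is a minimal sketch of the Huggingface workflow described above. The repository id is a placeholder, and the exact embed_dataset signature (whether it accepts raw sequences directly, the batch_size argument, the return type) is assumed for illustration rather than taken from the project's documentation.

    # Sketch only: the repo id is a placeholder and the embed_dataset
    # call reflects the behavior described above, not a verified signature.
    from transformers import AutoModel

    model = AutoModel.from_pretrained(
        "your-org/fastplm-checkpoint",  # hypothetical FastPLMs checkpoint id
        trust_remote_code=True,         # assumed: loads the custom model code from the Hub repo
    )

    sequences = [
        "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
        "MSILVTRPSPAGEEL",
    ]

    # embed_dataset batches the sequences internally; they are sorted by
    # length first so each batch carries as little padding as possible.
    embeddings = model.embed_dataset(sequences, batch_size=8)

Because the batching and padding logic lives inside embed_dataset, embedding a new dataset typically requires no custom preprocessing code.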
Why FastPLMs?
Efficient PLM Handling: Optimized for loading, fine-tuning, and running inference with large protein language models
Advanced Optimization: Built-in techniques for model quantization, pruning, and more (see the sketch after this list)
Scalability: Designed for handling large datasets and models
Community Driven: Enables researchers and developers to build sophisticated AI applications
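To illustrate the kind of quantization mentioned above, the standard transformers + bitsandbytes path looks roughly like this. Whether FastPLMs ships its own quantization helpers is not stated here, so treat this as a generic sketch under that assumption, with a placeholder repository id, rather than the project's own API.

    # Generic 8-bit quantized loading via transformers + bitsandbytes.
    # Requires the bitsandbytes and accelerate packages and a CUDA GPU;
    # the repo id below is a placeholder.
    from transformers import AutoModel, BitsAndBytesConfig

    quant_config = BitsAndBytesConfig(load_in_8bit=True)
    model = AutoModel.from_pretrained(
        "your-org/fastplm-checkpoint",   # hypothetical checkpoint id
        trust_remote_code=True,
        quantization_config=quant_config,
        device_map="auto",
    )
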
Join Our Open Source Community
Interested in contributing to our open source projects or exploring collaboration opportunities? We'd love to hear from you and discuss how we can work together to advance the field of machine learning.