- Yann-Aël Le Borgne in Towards Data Science: "OpenAI vs Open-Source Multilingual Embedding Models". Choosing the model that works best for your data. (Feb 24)
- Netflix Technology Blog in Netflix TechBlog: "Round 2: A Survey of Causal Inference Applications at Netflix". At Netflix, we want to ensure that every current and future member finds content that thrills them today and excites them to come back for… (Jun 6)
- Tracyrenee: "Can you predict the price of bitcoin using a linear regression model?". This may very well be my last post on predicting bitcoin prices (for a while anyway) because I have covered several different models that… (May 21)
- Cristian Leo in Towards Data Science: "Reinforcement Learning: Deep Q-Networks". Teaching a shuttle to land on the moon using Deep Q-Networks in Python. A mathematical deep dive into Reinforcement Learning. (May 23)
- Manas Chopra: "Audio Similarity Search Using Qdrant". Unlocking the Secrets of Spotify's Recommendation Magic. (Jun 22)
- Patrick Dougherty: "Building AI Agents: Lessons Learned over the past Year". Building AI Agents over the past year has been a roller coaster… Here's a little overview of what I've learned so far. (Jun 3)
- Alberto Romero: "Treat AI News Like a River, Not a Bucket". A quick fix to the overwhelming amount of stuff going on in AI. (May 22)
- Torsten Walbaum in Towards Data Science: "What 10 Years at Uber, Meta and Startups Taught Me About Data Analytics". Advice for Data Scientists and Managers. (May 30)
- Fareed Khan in Level Up Coding: "Building LLaMA 3 From Scratch with Python". Code Your Own Billion Parameter LLM. (May 28)
- Vishal Rajput in AIGuys: "Prompt Engineering Is Dead: DSPy Is New Paradigm For Prompting". DSPy Paradigm: Let's program — not prompt — LLMs. (May 29)
- Skylar Jean Callis in Towards Data Science: "Attention for Vision Transformers, Explained". The Math and the Code Behind Attention Layers in Computer Vision. (Feb 27)
- Skylar Jean Callis in Towards Data Science: "Position Embeddings for Vision Transformers, Explained". The Math and the Code Behind Position Embeddings in Vision Transformers. (Feb 27)
- Skylar Jean Callis in Towards Data Science: "Vision Transformers, Explained". A Full Walk-Through of Vision Transformers in PyTorch. (Feb 27)
- Skylar Jean Callis in Towards Data Science: "Tokens-to-Token Vision Transformers, Explained". A Full Walk-Through of the Tokens-to-Token Vision Transformer, and Why It's Better than the Original. (Feb 27)
- Srijanie Dey, PhD in Towards Data Science: "Deep Dive into Sora's Diffusion Transformer (DiT) by Hand ✍︎". Explore the secret behind Sora's state-of-the-art videos. (Apr 25)
- Srijanie Dey, PhD in Towards Data Science: "Deep Dive into Transformers by Hand ✍︎". Explore the details behind the power of transformers. (Apr 12)
- Srijanie Dey, PhD in Towards Data Science: "Deep Dive into Self-Attention by Hand ✍︎". Explore the intricacies of the attention mechanism responsible for fueling the transformers. (Apr 22)
- Chris Kuo/Dr. Dataman in Dataman in AI: "A Tutorial on the Open-source Lag-Llama for Time Series Forecasting". Sample eBook chapters (free): https://github.com/dataman-git/modern-time-series/blob/main/20240522beauty_TOC.pdf (Mar 20)
- Chris Kuo/Dr. Dataman in Dataman in AI: "From RNN/LSTM to Temporal Fusion Transformers and Lag-Llama". From RNN to sequence-to-sequence learning. (Mar 17)