I’m still closing tabs
98 tabs left. Here are the good ones:
https://www.recipetineats.com/jamaican-rice-and-peas/
https://en.m.wikipedia.org/wiki/Incircle_and_excircles
https://www.gnu.org/licenses/gpl-3.0.en.html
https://www.vincentsitzmann.com/siren/
https://en.m.wikipedia.org/wiki/Ren%C3%A9_Laennec
https://arxiv.org/abs/2309.05682 (A compendium of data sources for data science, machine learning, and artificial intelligence)
https://micthesnare.bandcamp.com/album/vulfmatic
https://topologyguides.com/
https://www.forbes.com/forbes/2010/0927/outfront-netscape-jim-barksdale-daniel-spivey-wall-street-speed-war.html
https://math.stackexchange.com/questions/2825967/feynmantangent-is-the-mean-proportional-between-the-two-parts-of-the-diameter
https://paperswithcode.com/paper/the-reversal-curse-llms-trained-on-a-is-b
https://buttondown.email/hillelwayne/archive/in-defense-of/
https://thephilosophersmeme.com/2019/04/05/the-memetic-bottleneck/
https://nedbatchelder.com/blog/202309/the_myth_of_the_myth_of_learning_styles.html
https://transformer-circuits.pub/2023/privileged-basis/index.html
https://arxiv.org/abs/2310.02207 (Language Models Represent Space and Time)
https://en.m.wikipedia.org/wiki/Hungry_judge_effect
https://arxiv.org/abs/2310.01798 (Large Language Models Cannot Self-Correct Reasoning Yet)
https://huggingface.co/papers/2310.01714 (Large Language Models as Analogical Reasoners)
https://transformer-circuits.pub/2023/monosemantic-features/index.html
https://www.lesswrong.com/posts/aPeJE8bSo6rAFoLqg/solidgoldmagikarp-plus-prompt-generation
https://en.m.wikipedia.org/wiki/Monotropa_uniflora
https://www.pnas.org/doi/10.1073/pnas.1012933107
https://arxiv.org/abs/2310.17191 (How do Language Models Bind Entities in Context?)
https://www.scikit-yb.org/en/latest/api/cluster/silhouette.html
https://arxiv.org/abs/2210.01240 (Language Models Are Greedy Reasoners: A Systematic Formal Analysis of Chain-of-Thought). This one is pretty cool.
https://www.lesswrong.com/posts/4Hnso8NMAeeYs8Cta/revealing-intentionality-in-language-models-through-adavae
https://en.m.wikipedia.org/wiki/Jorge_Luis_Borges
https://en.m.wikipedia.org/wiki/Hyperborea
https://plato.stanford.edu/entries/abduction/
https://en.m.wikipedia.org/wiki/Implicature
https://en.m.wikipedia.org/wiki/Ruby_Ridge (I’m from Idaho btw)
https://nlp.seas.harvard.edu/2018/04/03/attention.html#attention-visualization
https://transformer-circuits.pub/2021/framework/index.html
https://en.m.wikipedia.org/wiki/Andes
https://en.m.wikipedia.org/wiki/Ghetto_riots_(1964%E2%80%931969)#Riots
https://www.healthline.com/health/bone-health/whats-the-difference-between-supination-and-pronation
https://en.m.wikipedia.org/wiki/Economic_history_of_Argentina
https://scienceofselfhelp.org/articles-1/2020/11/11/how-to-focus-better-while-meditating
https://blog.datawrapper.de/beautifulcolors/
https://en.m.wikipedia.org/wiki/James%E2%80%93Lange_theory
https://humanaigc.github.io/animate-anyone/ (70 days on and they still haven't released the source. I think this was fake, though possibly they just chose to make it proprietary)
https://drawabox.com/
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1182327/ (Why Most Published Research Findings Are False)
https://deepmind.google/discover/blog/funsearch-making-new-discoveries-in-mathematical-sciences-using-large-language-models/
https://datascience.stackexchange.com/questions/79568/how-do-i-know-what-the-best-number-of-layers-is-required-to-achieve-the-highest
https://www.amazon.science/blog/do-large-language-models-really-need-all-those-layers
https://www.whizzy.org/2023-12-14-bricked-xmas/
https://figgie.com/how-to-play
https://jalammar.github.io/illustrated-gpt2/
https://www.lesswrong.com/posts/c6uTNm5erRrmyJvvD/mapping-the-semantic-void-strange-goings-on-in-gpt-embedding
https://arxiv.org/abs/2312.10794 (A mathematical perspective on Transformers)
https://lwn.net/SubscriberLink/955376/b3fba3bbfabbc411/ (The Linux graphics stack in a nutshell, part 1)
https://www.robkhenderson.com/p/the-machiavellian-maze
https://americanoutdoorschool.thinkific.com/ (Free comprehensive wilderness first aid course)
https://en.m.wikipedia.org/wiki/Open-source_cola
https://huggingface.co/spaces/coqui/xtts
https://www.sciencedirect.com/science/article/abs/pii/S0306261921011673 (Peak oil)
https://lighterpack.com/
https://blog.tanyakhovanova.com/2023/12/four-sheep/
https://www.newsinslowspanish.com/home/news/intermediate
https://sajforbes.nz/languageguide/resources/
https://mermaid.js.org/#/
https://en.m.wikipedia.org/wiki/Tears_of_the_Turtle_Cave
https://www.gamedeveloper.com/programming/1500-archers-on-a-28-8-network-programming-in-age-of-empires-and-beyond
https://critpoints.net/2018/02/18/good-fps-map-design/
https://calv.info/heat-pumps
Three tabs left open on my phone. I feel free.