
pogo stick high jump

YouTube Video
Click to view this content.

jet pack racing!
YouTube Video
Click to view this content.
that does sound neat. post some useful examples if you want. control systems sounds like an interesting application
this is called the distributivity of implication over disjunction in classical propositional logic
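That equivalence can be checked mechanically; a minimal truth-table sketch, assuming the law meant is P -> (Q v R) <=> (P -> Q) v (P -> R):

```python
# Truth-table check of the classical law referred to above
# (assumption: the intended equivalence is P -> (Q v R)  <=>  (P -> Q) v (P -> R))
from itertools import product

def implies(a: bool, b: bool) -> bool:
    # material implication: a -> b is (not a) or b
    return (not a) or b

def law_holds() -> bool:
    # exhaustively compare both sides on all 8 classical valuations
    for p, q, r in product([False, True], repeat=3):
        lhs = implies(p, q or r)
        rhs = implies(p, q) or implies(p, r)
        if lhs != rhs:
            return False
    return True

print(law_holds())  # True: the two forms agree on every valuation
```

Note this equivalence is classical: the right-to-left direction is intuitionistically fine, but left-to-right relies on excluded middle.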
Without choice, it's impossible to guarantee that an arbitrary product of non-empty sets is non-empty. But the empty product is a different matter: when the index family is empty, the Cartesian product is the set containing just the empty function, which is a singleton set and therefore non-empty. This follows directly from the definition and doesn't rely on the Axiom of Choice, so we can prove even in ZF¬C that the empty Cartesian product of non-empty sets is non-empty. So the question of proving this in ZF¬C is not problematic. The same holds across all the standard set theories (ZF, ZF¬C, ZFC), because it's not about selecting an element from each of the sets in the family (which would require AC); it's just about what the definition of the Cartesian product says when there are no sets in the family.
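Written out, the standard ZF definition makes this immediate:

```latex
\prod_{i \in I} X_i \;=\; \Bigl\{\, f : I \to \textstyle\bigcup_{i \in I} X_i \;\Bigm|\; f(i) \in X_i \text{ for all } i \in I \,\Bigr\}
```

For $I = \emptyset$ the only candidate is the empty function $f = \emptyset$, which satisfies the membership condition vacuously, so $\prod_{i \in \emptyset} X_i = \{\emptyset\}$, a singleton.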
Hopf Fibration: Technically, the base space of the Hopf fibration is a 2-sphere (S^2), but since the 2-sphere can be thought of as a compactified plane or a disk with an added point at infinity, it could count.
Möbius Strip: The Möbius strip can be thought of as a fiber bundle over the circle S^1 with fibers that are intervals of the real line.
Twisted Cylinder: Similar to a Möbius strip but with the fibers being open intervals instead of closed intervals.
The Klein Bottle: If you take S^1 as your fiber, the Klein bottle can be seen as a nontrivial fiber bundle over the circle.
Principal Bundles: The concept of a principal G-bundle, where G is a topological group, is a generalization of fiber bundles. For instance, the frame bundle of a manifold is a principal GL(n,R)-bundle, where GL(n,R) is the general linear group of invertible matrices, and n is the dimension of the manifold. As a more specific example, consider the tangent bundle of a disk, D2. The frame bundle of D2 is a principal GL(2,R)-bundle over the disk.
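For reference, all of the examples above instantiate the same local-triviality definition (standard, stated here for concreteness): a fiber bundle with total space $E$, base $B$, and fiber $F$ is a continuous surjection $\pi : E \to B$ such that every point of $B$ has a neighborhood $U$ admitting a homeomorphism

```latex
\varphi : \pi^{-1}(U) \xrightarrow{\;\cong\;} U \times F, \qquad \operatorname{pr}_U \circ \varphi = \pi\big|_{\pi^{-1}(U)}
```

The Möbius strip and the cylinder $S^1 \times [0,1]$ share the same base and fiber; they differ only in whether a single such $\varphi$ can be chosen globally.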

high-level air hockey strats

YouTube Video
Click to view this content.
Hydro Thunder (Arcade) The Far East 1:19.46
YouTube Video
Click to view this content.
Super Mario Odyssey Any% (2 Player) in 56:54

Highlights - Cycle Ball - Gold Medal Final | 2021 UCI Indoor Cycling World Championships

YouTube Video
Click to view this content.

Underwater rugby

YouTube Video
Click to view this content.


The winner outran the fastest horse by over 10 minutes in the 22-mile race held in Mid Wales.


Milwaukee Senior TT - Highlights | 2023 Isle of Man TT Races

YouTube Video
Click to view this content.

open source math textbooks

A curated list of awesome mathematics resources (rossant/awesome-math on GitHub).

some links are broken but otherwise good. Post your open source math textbooks here

After state board approves first taxpayer-funded Catholic school, Hindus seek same | KGOU

As Oklahoma pushes ahead with plans for the first ever taxpayer-funded Catholic public charter school, some say other religions should be included.

cross-posted from: https://lemmy.sdf.org/post/43170

"Prompt Gisting:" Train two models such that given inputs "Translate French
<G1>
<G2>
" and "<G1>
G2>The cat," then G1 and G2 represent the entire instruction.

cross-posted from: https://lemmy.sdf.org/post/36227
Abstract: "Prompting is now the primary way to utilize the multitask capabilities of language models (LMs), but prompts occupy valuable space in the input context window, and re-encoding the same prompt is computationally inefficient. Finetuning and distillation methods allow for specialization of LMs without prompting, but require retraining the model for each task. To avoid this trade-off entirely, we present gisting, which trains an LM to compress prompts into smaller sets of "gist" tokens which can be reused for compute efficiency. Gist models can be easily trained as part of instruction finetuning via a restricted attention mask that encourages prompt compression. On decoder (LLaMA-7B) and encoder-decoder (FLAN-T5-XXL) LMs, gisting enables up to 26x compression of prompts, resulting in up to 40% FLOPs reductions, 4.2% wall time speedups, storage savings, and minimal loss in output quality. "

Taming AI Bots: prevent LLMs from entering "bad" states by using continuous guidance from the LLM itself ("is this good? bad?") to steer away from them.

The space of homogeneous probability measures on $\overline{\Gamma \backslash X}_{\max}^{S}$ is compact

NVIDIA's everything 2 anything

YouTube Video
Click to view this content.