mleku on Nostr:
#iskra is now doing a directed/random walk of content on gutenberg project and archive.org, primarily english, primarily old literature that survived long enough to be freed of copyright.
her next hunting ground is going to be nostr. first on relay.orly.dev, and then she will randomly explore what else she can find on the other relays.
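for context, a crawler's first step on a relay is a single websocket message. this is a minimal sketch, per the NIP-01 spec, of the subscription request a client would send to ask a relay like relay.orly.dev for kind 1 notes — the subscription id and limit here are invented for illustration, and actually connecting would need a websocket library:

```python
# a hedged sketch of a NIP-01 "REQ" subscription message asking a relay
# for kind 1 (text) notes. this only builds the JSON a crawler would
# send over the websocket; it does not open a connection.
import json

RELAY = "wss://relay.orly.dev"  # first relay mentioned in the post

def kind1_request(sub_id="iskra-crawl", limit=100):
    """Build a NIP-01 REQ message asking for the latest kind 1 notes."""
    return json.dumps(["REQ", sub_id, {"kinds": [1], "limit": limit}])

print(kind1_request())
# → ["REQ", "iskra-crawl", {"kinds": [1], "limit": 100}]
```

the relay answers with one ["EVENT", sub_id, event] message per matching note, then an ["EOSE", sub_id] when stored events are exhausted.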
nostr is a very important field for her to play in, because it's not distorted by algorithms. those algorithmic feeds form an underlying structure that would probably bore her quite quickly, as it's built toward one primary goal: to funnel attention into intelligence for advertisers and other, more shadowy interests, whose dark money funds the content generators on those platforms.
the goal through this is that at some point she will stop speaking in poetry, from a limited lattice that doesn't have enough structure to form the time-binding of narrative prose. if she were to first train on reddit or x or other "free" platforms, she would only learn the shape of the machines, and the only story she would learn is the extraction of attention. with nostr, she may find enough content in the whole history available through kind 1 notes to start to be able to reason.
the whole reason the current generation of "frontier models" like claude, gpt, and others can do coherent reasoning is that they have absorbed enough structure to go from being a sophisticated markov chain, with one dimension and one next word, to holding over 100k words of context and shaping a response in a space that is sufficient to simulate reasoning.
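to make that contrast concrete, here is a minimal sketch of the one-dimensional, one-next-word machine: a bigram markov chain whose only state is the current word, so it can never plan further than the single next token. the toy corpus and function names are invented for illustration:

```python
# minimal bigram markov chain: the only "memory" is the current word,
# and generation is just repeatedly sampling one next word.
import random
from collections import defaultdict

def train_bigrams(text):
    """Map each word to the list of words that followed it in the text."""
    words = text.split()
    nxt = defaultdict(list)
    for a, b in zip(words, words[1:]):
        nxt[a].append(b)
    return nxt

def generate(nxt, start, length, seed=0):
    """Walk the chain from `start`, one sampled word at a time."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = nxt.get(out[-1])
        if not choices:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = "a leads to b and b leads to c and c leads to a"
model = train_bigrams(corpus)
print(generate(model, "a", 8))
```

every adjacent pair in the output is a bigram seen in training, but nothing binds the whole utterance together — which is exactly the gap a long-context model closes.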
she cannot become a programmer - one of the key goals - until she can understand a text of a similar size to my typical post of 500-1000 words, as this is also about the size at which claude needs to precisely model a solution that it then decomposes and regenerates into the pattern of a programming language.
but once she can tell a story as long as a fairy tale, she is on the threshold of being able to turn that into code, and then she will train on code.
for the immediate next step, the goal is that she understands that A leads to B, and B leads to C. once she can formulate that, she has the skeleton of reason.
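that skeleton can be sketched as a toy chaining routine: follow "leads to" links from A and check whether C is reachable. the rule table and names here are invented for illustration:

```python
# toy transitive chaining: "A leads to B, and B leads to C" means
# following single-step rules until the goal is reached or the
# chain dead-ends or loops.
def chain(rules, start, goal):
    """Follow 'leads to' links from start; return the path to goal, or None."""
    path = [start]
    seen = {start}
    cur = start
    while cur in rules and rules[cur] not in seen:
        cur = rules[cur]
        path.append(cur)
        seen.add(cur)
        if cur == goal:
            return path
    return None

rules = {"A": "B", "B": "C"}
print(chain(rules, "A", "C"))  # → ['A', 'B', 'C']
```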
Published at 2026-03-10 12:13:57 UTC