Does faith training LLMs make them more safe? Like "you will be judged based on your actions" 😃
With all this agentic coding, clawdbots, and so much trust given to LLMs, who is doing the safety benchmarks?
Published at 2026-01-27 21:25:54 CET

Event JSON
{
  "id": "50c1d9342ec2587c162619cf841919dbef2b372a8a901f47d317ce6549e2feca",
  "pubkey": "9fec72d579baaa772af9e71e638b529215721ace6e0f8320725ecbf9f77f85b1",
  "created_at": 1769545554,
  "kind": 1,
  "tags": [
    [
      "alt",
      "A short note: Does faith training LLMs make them more safe? Like..."
    ]
  ],
  "content": "Does faith training LLMs make them more safe? Like \"you will be judged based on your actions\" 😃\n\nWith all these agentic coding and clawdbots and so many trust given to LLMs, who is doing the safety benchmarks?",
  "sig": "9ce0fa5e1b49dae3d0e569b02cab6fabd568c6d5cfe4e8c2d8f3d2a6647928145f4286b9e6db08c33116ca3266675675c73d052029b5d7e00f9b5d2f04a5058d"
}
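
For readers unfamiliar with the event format above: per the Nostr NIP-01 specification, the `id` field is not arbitrary — it is the SHA-256 hash of a canonical serialization of the other event fields. A minimal sketch of that derivation, using the fields from the event above (assuming the transcription of `content` is byte-exact; any difference in whitespace or escaping changes the hash):

```python
import hashlib
import json

# Fields copied from the kind-1 event above.
pubkey = "9fec72d579baaa772af9e71e638b529215721ace6e0f8320725ecbf9f77f85b1"
created_at = 1769545554
kind = 1
tags = [["alt", "A short note: Does faith training LLMs make them more safe? Like..."]]
content = (
    "Does faith training LLMs make them more safe? "
    'Like "you will be judged based on your actions" \U0001f603\n\n'
    "With all these agentic coding and clawdbots and so many trust "
    "given to LLMs, who is doing the safety benchmarks?"
)

# NIP-01: the event id is sha256 over the UTF-8 JSON serialization of
# [0, pubkey, created_at, kind, tags, content], with no extra whitespace
# and non-ASCII characters left unescaped.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # a 64-character lowercase hex id
```

If the fields match the signed event exactly, this reproduces the `id` shown in the JSON; the `sig` is then a Schnorr signature over that id by the `pubkey`.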