Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
Well, I'm not exactly sure about it, but what I can do is describe how this process works in terms of operations, and you can draw your own conclusions. There's not that much complexity in peptides in the first place, because synthetically, all you have to do is make a lot of amide bonds, and that's a solved problem. The slightly bigger problem is making them in a controlled way, which is why protecting groups are used, but those have also been around for decades.
The trick is to bind the thing you want to make to a resin, which keeps it insoluble, so your product always stays in the reactor. This can be an actual dedicated automated reactor or just a syringe with a filter. We start with
Resin-NH-Fmoc
Fmoc is a protecting group that falls off when flushed with a base, so we do that, wash the resin, and get
Resin-NH2
Then we can add a coupling agent and a protected amino acid, for example leucine, wash again, and this gets us
Resin-NH-Leu-Fmoc
Then repeat. Every operation is: add reagent or wash solvent, stir, wait, drain, repeat. Deprotection, wash, coupling, wash, repeat. It's all very amenable to automation, and it was explicitly designed that way. When everything is done, the resin is treated with acid, which releases the peptide, and because the resin can be washed between steps, there are no leftover reagents.
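The cycle above can be sketched as toy string bookkeeping (not chemistry software, just the operations in order): the chain stays bound to the resin, and each residue is added by deprotect, wash, couple, wash. The sequence and function names here are illustrative.

```python
# Toy sketch of one Fmoc solid-phase cycle: deprotect, couple, repeat;
# acid cleavage at the end releases the finished peptide from the resin.

def deprotect(chain):
    """Base wash removes the Fmoc group from the resin-bound chain."""
    assert chain.endswith("-Fmoc"), "nothing to deprotect"
    return chain[: -len("-Fmoc")]

def couple(chain, residue):
    """Coupling agent attaches one Fmoc-protected amino acid."""
    return f"{chain}-{residue}-Fmoc"

def cleave(chain):
    """Acid treatment releases the peptide from the resin."""
    return chain.replace("Resin-NH", "H2N", 1)

def synthesize(sequence):
    chain = "Resin-NH-Fmoc"            # starting material
    for residue in sequence:           # one deprotect/couple cycle each
        chain = couple(deprotect(chain), residue)
    return cleave(deprotect(chain))

# After the first cycle the chain is "Resin-NH-Leu-Fmoc", as in the text.
print(synthesize(["Leu", "Ala", "Gly"]))  # → H2N-Leu-Ala-Gly
```

The point the loop makes is the same one the text makes: every step is the same small set of operations, which is exactly why the process automates so well.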
Of course it can all be done by hand, and this allows for things like putting a couple of amino acids on resin at a big scale, then splitting it into several batches and attaching different things on top of that in parallel. A machine can't do this (at least not a machine like the one we have). What the machine can do is run all of this hot, which makes it fast, though sometimes things break that way; it can also run unattended for more than one shift (when it's not broken). Sometimes things fail to work anyway, and it's the specialist's job to figure out why and unfuck it. Sometimes the peptide folds on itself in such a way that the -NH2 end is hidden inside and the next residue can't be attached. This can be fixed by gluing two amino acids together in a flask and then using that pair instead of a single one in the machine, bypassing the problematic step. Or in a couple of other ways, and picking the right one requires knowing what you are doing.
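The split-batch trick described above is just bookkeeping when written out: grow a shared stem once on a big batch of resin, split it, and extend each portion with a different tail in parallel. A minimal sketch, with made-up sequences:

```python
# Toy sketch of manual split-resin synthesis: one common stem, several
# different tails attached in parallel batches. Sequences are illustrative.

def extend(stem, tail):
    """Extend a resin-bound stem with further residues."""
    return stem + tail

stem = ["Leu", "Ala"]                        # common start, made once at scale
tails = [["Gly"], ["Phe", "Val"], ["Ser"]]   # one tail per split batch

batches = [extend(stem, tail) for tail in tails]
for peptide in batches:
    print("-".join(peptide))
# → Leu-Ala-Gly
# → Leu-Ala-Phe-Val
# → Leu-Ala-Ser
```

The shared work (the stem) is done once; only the divergent part is repeated per batch, which is the flexibility a start-to-finish machine run doesn't offer.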
Solution-phase synthesis looks different because every step requires purification afterwards, which is sometimes something you can wing and sometimes not. The advantage is that when you need lots of product, you can just use bigger flasks, while a bigger machine (and large amounts of resin) gets prohibitively expensive. Ozempic was made in solution (at least once), for example. Again, doing things by hand gets you extra flexibility: the machine can only make a peptide from start to finish in a single run, but in solution you can start from, say, five points and then put the pieces together (which starts to look like convergent synthesis, and also makes it better for large scale). The machine can't do that (unless the pieces are provided, but at that point most of the work is already done).
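The scaling argument hinted at above can be made concrete with toy arithmetic (numbers and the step model are illustrative, not from any real process): a linear machine run needs one coupling cycle per residue in strict series, while a convergent route grows fragments in parallel and only the fragment joins add sequential steps.

```python
# Toy step-count comparison: linear (residue-by-residue) vs convergent
# (fragments grown in parallel, then joined). Purely illustrative.
import math

def linear_steps(n_residues):
    """One coupling cycle per residue, strictly sequential."""
    return n_residues

def convergent_steps(n_residues, n_fragments):
    """Fragments are made in parallel, then joined one after another.
    (Joins could themselves be done pairwise in parallel, which would
    be fewer still; this keeps the arithmetic simple.)"""
    per_fragment = math.ceil(n_residues / n_fragments)
    joins = n_fragments - 1
    return per_fragment + joins

print(linear_steps(30))         # → 30 sequential cycles
print(convergent_steps(30, 5))  # → 10 (6 per fragment + 4 joins)
```

Fewer sequential steps on the longest path, plus purification of intermediate fragments, is roughly why convergent solution-phase work wins at large scale.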
Looking back at these people: even when the operations are simplified, there's no deskilling of operators like they aimed for; it's just the throughput that increases. They also don't have the benefit of that "keeping the important thing always in the reactor" trick.