
@kentpitman@climatejustice.social
Posts: 17 · Topics: 1 · Shares: 0 · Groups: 0 · Followers: 0 · Following: 0


Posts


  • Can't wait to use and promote illegal operating systems that do not verify age.

    @catsalad

    They'll be able to tell your age by the fact that you care about this issue. 😉

    Uncategorized

  • @screwlisp is having some site connectivity problems so asked me to remind everyone that we'll be on the anonradio forum at the top of the hour (a bit less than ten minutes hence) for those who like that kind of thing:

    @dougmerritt @djl @wrog @ramin_hal9001 @screwlisp @cdegroot

    For those looking on who might not know these terms: teletypes had paper feeding through them and mostly did only output, printed left to right; they then fed the line forward and never backed up to a previous line. They were also loud and clunky, mostly, and had keyboards whose keys you had to press way down to get them to register.

    Glass terminals were displays that could only do output to the bottom line of the screen, kind of like a paper terminal but without the paper. Once a line scrolled up, you generally couldn't scroll back down. That's why a glass terminal might sound like something that should have had cursor control but did not yet.

    Uncategorized lispygopher gopher lisp commonlisp

  • @wrog @dougmerritt @ramin_hal9001 @screwlisp @cdegroot

    Funny, I couldn't recall "~" being important at all so had to go check. See https://codeberg.org/PDP-10/its/src/branch/master/doc/_teco_/tecord.1132 and while I do see a few uses of it, they seem very minor.

    I read this into an Emacs editor buffer and did "M-x occur" looking for [~] and got these, all of which seem highly obscure. I think it is probably because in the early days there may have been a desire not to have case matter, so the upper- and lower-case versions of these special characters (see line 2672 below) may have once been equivalent, or there may have been some reason to reserve space for them to be equivalent in some cases. Remember that, for example, on a VT52, the CTRL key did not add a control bit but masked out all the bits beyond the 5th, so that CTRL+@ and CTRL+Space were the same (null) character. And sometimes tools masked out the 7th bit in order to uppercase something, which means that certain characters like these might in some cases have gotten blurred.

    10 matches for "[~]" in buffer: tecord.1132
    1270: use a F~ to compare the error message string against a
    2017: case special character" (one of "`{|}~<rubout>").
    2235: the expected ones, with F~.
    2370: kept in increasing order, as F~ would say, or FO's binary
    2672: also ("@[\]^_" = "`{|}~<rubout>").
    4192:F~ compares strings, ignoring case difference. It is just
    4446: this option include F^A, F^E, F=, FQ, F~, G and M.
    4942: string storage space, but begins with a "~" (ASCII 176)
    4977: character should be the rubout beginning a string or the "~"
    4980: "~" or rubout, then it is not a pointer - just a plain number.

    If I recall correctly, this also meant that in some tools using a control-prefix such as CTRL-^, CTRL-^ CTRL-@ could be different from CTRL-^ @, because one of them might set the control bit on @ and the other on null, so there was a lot of aliasing. It even happened for regular characters: CTRL-^ CTRL-A would get you a control bit set on #o1, while CTRL-^ A would get you the control bit set on 65. Some of these worked very differently on the Knight TV, which used SAIL characters, I think, and which thought a code like 1 was an uparrow, not a control-A. There were a lot of blurry areas, and it was hell on people who wanted to make a Dvorak mode, because it was the VT52 (and probably VT100 and AAA) hardware doing this translation, so there was no place in software to intercept all this and make it different. That's probably why something as important as TECO trod lightly on making some case distinctions.
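
    Purely as an illustration (in Python, nothing from the period), the masking behavior described above can be sketched like this, assuming 7-bit ASCII codes:

```python
# Illustrative sketch only: models a terminal whose CTRL key, as described
# above, does not add a control bit but masks off all bits beyond the 5th.
def vt52_ctrl(ch):
    # Keep only the low 5 bits of the 7-bit character code.
    return ord(ch) & 0x1F

# CTRL+@ and CTRL+Space collapse to the same (null) character:
assert vt52_ctrl('@') == vt52_ctrl(' ') == 0

# And masking out the 0x20 bit "uppercases" a character, which is how
# special characters could blur together; note this reproduces the pairing
# shown on line 2672 of the tecord listing ("@[\]^_" = "`{|}~<rubout>"):
def mask_upper(ch):
    return chr(ord(ch) & ~0x20)

assert mask_upper('a') == 'A'
assert mask_upper('~') == '^'
assert mask_upper('{') == '['
```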

    But if someone remembers better, please let me know. It's been 4+ decades since I used this stuff a lot, and details slip away. It's just that these things linger, I think, because they were so important to realize were live rails not to tread upon. And because I did, for a while, live and breathe this stuff, since I wrote a few TECO libraries (like ZBABYL and the original TeX mode), so I guess practice drills it in, too.

    Uncategorized lispygopher gopher lisp commonlisp

  • @wrog @dougmerritt @ramin_hal9001 @screwlisp @cdegroot

    It's a bit low-tech, but if you notice in time that other people haven't attached a ton of other stuff to it, just save the text, delete the old post, and attach the new one. Someone could make that a single operation in a client, and even have it send mail to the people who attached replies, saying "here's your text if you want to attach it to the new post". Or you could attach your own post with their text in it. Low-tech as they are, existing tools offer us a lot more options than people sometimes see. I'm sure you could have figured this out, and are more fussing at the tedium, but just for fun I'm going to cross-reference a related but different scenario...

    https://web.archive.org/web/20100921050348/http://open.salon.com/blog/kent_pitman/2010/09/18/the_cornfield_explained

    Uncategorized lispygopher gopher lisp commonlisp

  • @dougmerritt @wrog @ramin_hal9001 @screwlisp @cdegroot

    In effect, a Q register, which was what passed for storage in TECO, was something you could name in one byte. So 1,2mA, meaning "call what's in A with args 1 and 2", was a high-level-language function call with two arguments that fit into a single machine word. Even the PDP-10 PUSHJ instruction, which was pretty sophisticated as a way of calling a function, couldn't pass arguments with that degree of compactness.
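
    As a quick sanity check on that compactness claim (just arithmetic, in Python; the 7-bits-per-character packing is the encoding described elsewhere in this thread):

```python
# The whole call "1,2mA" -- two arguments plus the call itself -- is
# 5 characters, which at 7 bits each fits in a single 36-bit PDP-10 word.
command = "1,2mA"
bits_needed = len(command) * 7
assert bits_needed == 35
assert bits_needed <= 36   # fits, with one bit to spare
```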

    Uncategorized lispygopher gopher lisp commonlisp

  • @dougmerritt @wrog @ramin_hal9001 @screwlisp @cdegroot

    TECO was a necessary innovation under word-addressed memory. With 36 bits per word, you couldn't afford that much space for an instruction: 5 7-bit bytes (with a bit left over) in one word was a lot more compact than an assembly instruction. With only 256 KW (kilowords) total addressable in 18 bits, you had to pack in all the power you could. And we didn't have WYSIWYG yet, and most computer people couldn't type, so TECO would make a lot more sense to you if you were doing hunt-and-peck with almost no visibility into what you were changing. Typing -3cifoo$$ to mean "go back three characters, insert foo, and show me what the few characters around my cursor look like" was extremely natural in context. That it became a programming language was a natural extension, so that you didn't have to keep typing the same things over and over again.
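
    The arithmetic behind those numbers, spelled out (nothing here beyond what the paragraph already says):

```python
# 36-bit PDP-10 words hold 5 seven-bit bytes with one bit left over.
WORD_BITS = 36
CHAR_BITS = 7
chars_per_word = WORD_BITS // CHAR_BITS
leftover_bits = WORD_BITS - chars_per_word * CHAR_BITS
assert chars_per_word == 5
assert leftover_bits == 1

# 18-bit addresses reach 256 kilowords of memory.
ADDRESS_BITS = 18
assert 2 ** ADDRESS_BITS == 256 * 1024
```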

    Uncategorized lispygopher gopher lisp commonlisp

  • @ramin_hal9001

    I'm not 100% positive I understand your use of "constraint" here, but I think it is more substantive than that. If you want to use the metaphor you've chosen, a haiku reaches close to the theoretical minimum of what can be compressed into a statement, while a long-form essay does not. This metaphor is not perfect, though, and will lead astray if looked at too closely, causing an excess focus on differential size, which is not actually the key issue to me.

    I won't do it here, but as I think I've alluded to more than once on the LispyGopher show, I believe that it is possible to rigorously assign cost to the loss of expression between languages.

    That is, a transformation of expressional form is not, claims of Turing equivalence notwithstanding, cost-free, either in terms of efficiency or in terms of the expressional equivalence of the language. It has implications (positive or negative) any time you make such changes.

    Put another way, I no longer believe in Turing Equivalence as a practical truth, even if it has theoretical basis.

    And I am pretty sure the substantive loss can be expressed rigorously, if someone cared to do it, but because I'm not a formalist, I'm lazy about sketching how to do that in writing, though I think I did so verbally in one of those episodes.

    It's in my queue to write about. For now I'll just rest on bold claims. 🙂 Hey, it got Fermat quite a ways, right?

    But also, I had a conversation with ChatGPT recently where I convinced it of my position and it says I should write it up... for whatever that's worth. 🙂

    cc @screwlisp @wrog @dougmerritt @cdegroot

    Uncategorized lispygopher gopher lisp commonlisp

  • @dougmerritt @screwlisp @cdegroot @ramin_hal9001

    Well, I'm just trying to explain why hygiene seems more like a crisis to the Scheme community than it did to the CL community, who mostly asked "why is this a big deal?". It is a big deal in Scheme. And it's not because of the mindset, it's because different designs favor different outcomes.

    In other words, the CL community would have been outraged if we had overcomplicated macros, while the Scheme community was grateful for safety it actually perceived a need for.

    So yes, "the master is which community you aim to serve". We agree on that. 🙂

    Uncategorized lispygopher gopher lisp commonlisp

  • @wrog @screwlisp @cdegroot @ramin_hal9001

    Thanks for this detailed reply. Lotta good stuff there. Also, thanks especially for indulging the improper fraction. I mostly do not use fractional labeling for posts for fear of that scenario: sometimes you've promised to stop, then realize you want to keep going and feel impeded. I'm glad you kept on.

    Uncategorized lispygopher gopher lisp commonlisp

  • @ramin_hal9001 @screwlisp @wrog @dougmerritt @cdegroot

    The LispM did a nice thing (at some tremendous cost in hardware, I guess, but useful in the early days) by having various kinds of forwarding pointers for this. At least you knew you were going to incur overhead, and pricing it properly put a premium on side-effecting and tended to cause people not to do it. And the copying GC could fix the problem eventually, so you didn't pay the price forever, though you did pay for having such specific hardware, or for cycles in systems trying to emulate it that couldn't hide the overhead cost. I tend to prefer the pricing model over the prohibition model, but I see both sides of that.
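
    A toy sketch (in Python, purely illustrative, not the LispM mechanism itself) of the forwarding-pointer idea: reads through an old reference chase the pointer to the object's current location, so displaced structure stays consistent, at the cost of extra indirection on every access:

```python
# Toy model of a forwarding pointer: a cell that, once "forwarded",
# transparently delegates reads to its new location.
class Cell:
    def __init__(self, value):
        self.value = value
        self.forward = None   # set when the object has been displaced

    def get(self):
        cell = self
        while cell.forward is not None:   # chase the forwarding chain
            cell = cell.forward
        return cell.value

old = Cell(1)
new = Cell(2)
old.forward = new      # the old location now just points at the new one
assert old.get() == 2  # readers holding the old reference see the new value
```

    A copying GC, as the post notes, can later rewrite the old references to point directly at the new location, eliminating the chain and its per-read premium.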

    If my memory is correct (so yduJ or wrog, please fix me if I goof this): MOO, as a language, is in an interesting space in that actual objects are mutable but list structure is not. This reflects the observation that an actual object (what CL would call a standard class, though the uses are different in MOO) is persistent and unlikely to be allocated casually, so it is less likely to be garbage the GC would want to be involved with anyway.

    I always say "good" or "bad" is true in a context. It's not true that side effect is good or bad in the abstract, it's a property of how it engages the ecology of other operations and processes.

    And, Ramin, the abolishing of mutable variables has other intangible expressional costs, so it's not a simple no-brainer. But yes, if people are locked into a mindset that says such changes couldn't improve performance, they'd be surprised. Ultimately, I prefer to design languages around how people want to express things, and I like occasionally doing mutation even if it's not common, so I like languages that allow it and don't mind if there's a bit of a penalty for it or if one says "don't do this a lot because it's not aesthetic or not efficient or whatever".

    To make a really crude analogy, one has free speech in a society not to say the ordinary things one needs to say. Those things are favored speech regardless because people want a society where they can do ordinary things. Free speech is everything about preserving the right to say things that are not popular. So it is not accidental that there are controversies about it. But it's still nice to have it in those situations where you're outside of norms for reasonable reasons. 🙂

    Uncategorized lispygopher gopher lisp commonlisp

  • @dougmerritt @screwlisp @cdegroot @ramin_hal9001

    > it seems you are saying that Lisp macros aren't so bad if their use is constrained to safe uses

    Well, what I'm saying isn't formal, and that in itself bugs some people. But the usual criticism of the CL system isn't that "people have to be careful", it's that "ordinary use is not safe". But there's safe and then there's safe.

    There is a sense in which C is objectively less safe than, say, Python or Lisp. And there is a sense in which people who write languages that aspire to more proofs think those languages are still not safe. So there's a bit of a continuum here that makes terminology tricky. I have to make some assumptions that are fragile, because some after-the-fact dodging can be done, where critics do not acknowledge the incremental strengths but just keep pointing out other problems as if that's what they meant all along.

    In Scheme, and ignoring that you could do this functionally, writing a macro foo that takes an argument and yields the list of that argument can't look like `(list ,thing), because if it is used in some situation like (define (bar list) (foo list)), you fall victim to namespace clashes. And so Scheme people dislike this paradigm. But even without careful planning, the same problem is FAR LESS likely to happen in CL because:

    Parameters that might get captured are usually in the variable namespace. You CAN bind functions, but it's rare, and it's super-rare for the names chosen to be names of pre-defined functions. You'd have to be in some context where someone had done (flet ((list ...)) ...) for the list function to be bound to something unexpected, and even then you're not supposed to bind list to something unexpected for other reasons, mainly that the symbol list is shared.

    I allege that in the natural course of things, it's FAR more rare in CL for the expansion of a macro to contain something that would get unexpectedly captured, for reasons that do not exist in the Scheme world. Formally, yes, there is still a risk, but what makes this such an urgency in the Scheme world is the choice to have a Lisp1 and the choice to have no package system; CL's opposite choices each create insulation. In practice, the functional part of the CL world does not vary, as uses of FLET are very rare. And it's equally rare for a macro to expand into free references that are not functional references.

    Also, the CL world has gensyms easily available, and CL systems often have other mechanisms that package up their use to be easy. In the Scheme world, there is no gensym, and the language semantics is defined not on objects but on the notation itself. This makes things hard to compare, but it also makes it easy to miss how package separation eliminates a broad class of the surprise: usually you know what's in your own package and aren't affected by what's in someone else's, whereas in Scheme, symbols are symbols, and it's far more dangerous to rely on lexical context alone to sort everything out.
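
    To illustrate the gensym point with a toy model (Python standing in for a macro expander here; the names and the string-template "macros" are invented for the example, not any real API):

```python
import itertools

# Toy model of variable capture: a "macro" expands user code into a template.
_counter = itertools.count()

def gensym(prefix="tmp"):
    # A fresh name guaranteed not to collide with user-written names.
    return f"__{prefix}{next(_counter)}"

def expand_naive(user_expr):
    # The template hard-codes its helper variable "tmp"...
    return f"tmp = 99\nresult = ({user_expr}) + tmp"

def expand_hygienic(user_expr):
    # ...while this version picks a fresh name for each expansion.
    t = gensym()
    return f"{t} = 99\nresult = ({user_expr}) + {t}"

env1 = {"tmp": 10}             # the user has their own variable "tmp"
exec(expand_naive("tmp * 2"), env1)
assert env1["result"] == 297   # template's tmp=99 captured the user's

env2 = {"tmp": 10}
exec(expand_hygienic("tmp * 2"), env2)
assert env2["result"] == 119   # user's tmp=10 stayed visible
```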

    So yes, CL is less dangerous if you limit yourself, but it's also less dangerous because a lot of the time you don't have to think hard about limiting yourself. The language's features naturally create safer situations. Note I am making a relative, not an absolute, measurement of safety. I'm saying that if CL were as full of conflict opportunities as Scheme is, we'd have rushed to use hygiene, too. But mostly it wasn't, so no one felt the urge.

    Uncategorized lispygopher gopher lisp commonlisp

  • @nosrednayduj @screwlisp @cdegroot

    Also Naomi Klein's book The Shock Doctrine, very politically relevant this week, traces a lot of political ills to Milton Friedman and his ideas.

    The Shock Doctrine: The Rise of Disaster Capitalism (Goodreads, www.goodreads.com)

    Uncategorized lispygopher gopher lisp commonlisp

  • @nosrednayduj @screwlisp @cdegroot

    And, unrelated: another reference I made in the show was to Clyde Prestowitz and his book The Betrayal of American Prosperity.
    https://www.goodreads.com/book/show/8104391-the-betrayal-of-american-prosperity

    Also, an essay I wrote summarizes a key point from it, though it's not really related to the topic of the show. I mention it just because that point may also interest this audience on the issue of capitalism, if not on the specific economic issue we were talking about tonight:
    https://netsettlement.blogspot.com/2012/01/losing-war-in-quiet-room.html

    Uncategorized lispygopher gopher lisp commonlisp

  • @ramin_hal9001 @loke @screwlisp @cdegroot

    I'm glad you mentioned delimited continuations. They go overlooked a lot.

    Uncategorized lispygopher gopher lisp commonlisp

  • @nosrednayduj @screwlisp @cdegroot

    First, thanks for raising that example. It's interesting and contains info I hadn't heard.

    In a way, it underscores my point: that for a while, it was an open question whether we could implement GC, but a bet was made that we could.

    You could view that as saying they only implemented part of Lisp, and that the malloc stuff was a stepping out of paradigm, an admission the bet was failing for them in that moment. Or you could view it as a success, saying that even though some limping was required of Lisps while we refined the points, it was done.

    As I recall, there was some discussion of adding a GC function. At the time, the LispM people probably said "which GC would it invoke" and the Gensym people probably said "we don't have one". That was the kind of complexity that the ANSI process turned up and it's probably why there is no GC function. (There was one in Maclisp that invoked the Mark/Sweep GC, but the situation had become more complicated.)

    Also, as an aside, a personal observation about the process: With GC, as with other things like buffered streams, one of the hardest things to get agreement on was something where one party wanted a feature and another said "we don't have that, I'd have to make it a no-op". Making it a no-op was not a lot of implementation work. Just seeing and discarding an arg. But it complicated the story that was told, and vendors didn't like it, so they pushed back even though of all the implementations they had the easiest path (if you didn't count "explaining" as part of the path).

    Uncategorized lispygopher gopher lisp commonlisp

  • At the end of @screwlisp's show, in the discussion of @cdegroot's book, @ramin_hal9001 was talking about continuations. I wanted to make a random point that isn't often made about Lisp that I think is important.

    I often do binary partitions of languages (like the static/dynamic split, but more exotic), and one of them is whether they are leading or following, let's say. There are some respects in which Scheme is a follower, not a leader, in the sense that it tends to eschew some things that Common Lisp does, for a variety of reasons, one of them being "we don't know how to compile this well". There is a preference for a formal semantics that is very tight, in which everything is well understood. It is perhaps fortunate that Scheme came along after garbage collection was well worked out and did not seem to fear that it would be a problem, but I would say that Lisp had already led on garbage collection.

    The basic issue is this: Should a language incorporate things that maybe are not really well understood, just because people need to do them, on the assumption that it might as well standardize the 'gesture' (to use the CLIM terminology) or 'notation' (to use the more familiar term) for saying you want to do that thing?

    Scheme did not like Lisp macros, for example, and only adopted macros when hygienic macros were worked out. Lisp, on the other hand, started with the idea that macros were just necessary and worried about the details of making them sound later.

    Scheme people (and I'm generalizing to make a point here, with apologies for painting an entire group with a broad brush that is probably unfair) think Common Lisp macros are more unhygienic than they actually are, because they don't give enough credit to things like the package system, which Scheme does not have, and which protects CL users from collisions a lot more than they acknowledge. They also don't fairly understand the degree to which Lisp2 protects against the most common scenarios, which would happen all the time in Scheme if it had a symbol-based macro system. So CL isn't really as much at risk these days, though it was a bigger issue before packages. The point is that Lisp decided it would figure out how to tighten things later, but that macros were too important to leave out, whereas Scheme held back the design until it knew.

    But, and this is where I wanted to get to, Scheme led on continuations. That's a hard problem and while it's possible, it's still difficult. I don't quite remember if the original language feature had fully worked through all the tail call situations in the way that ultimately it did. But it was brave to say that full continuations could be made adequately efficient.

    And the Lisp community in general (and here I will include Scheme, though on other days I think these communities sufficiently different that I would not) has collectively been much more brave and leading than many languages, which only grudgingly allow functionality that they know how to compile.

    In the early days of Lisp, the choice to do dynamic memory management was very brave. It took a long time to make GCs efficient, and generational GC was, I think, what finally made people believe this could be done well in large address spaces. (In small address spaces, it was possible because touching all the memory to do a GC did not introduce thrashing, since data wasn't "paged out". And in modern hardware, memory is cheap, so the size is not always a per se issue.)

    But there was an intermediate time in which lots of memory was addressable but not fully realized as RAM, only virtualized, and GC was a mess in that space.

    The Lisp Machines had 3 different unrelated but co-resident and mutually usable garbage collection strategies that could be separately enabled, 2 of them using hardware support (typed pointers) and one of them requiring that computation cease for a while because the virtual machine would be temporarily inconsistent for the last-ditch thing that particular GC could do to save the day when otherwise things were going to fail badly.

    For a while, dynamic memory management would not be used in real time applications, but ultimately the bet Lisp had made on it proved that it could be done, and it drove the doing of it in a way that holding back would not have.

    My (possibly faulty) understanding is that the Java GC was made to work by at least some displaced Lisp GC experts, for example. But certainly the choice to make Java be garbage collected probably derives from the Lispers on its design team feeling it was by then a solved problem.

    This aspect of languages' designs, whether they lead or follow, whether they are brave or timid, is not often talked about. But I wanted to give the idea some air. It's cool to have languages that can use existing tech well, but cooler, I personally think, to see designers consciously driving the creation of such tech.

    Uncategorized lispygopher gopher lisp commonlisp

  • @screwlisp is having some site connectivity problems so asked me to remind everyone that we'll be on the anonradio forum at the top of the hour (a bit less than ten minutes hence) for those who like that kind of thing:

    https://anonradio.net:8443/anonradio

    He'll also be monitoring LambdaMOO at "telnet lambda.moo.mud.org 8888" for those who do that kind of thing. There are also Emacs clients you should get if you're REALLY using telnet.

    Topics for today, I'm told, may include the climate, the war, the oil price hikes, some rambles I've recently posted on CLIM, and the book by @cdegroot called The Genius of Lisp, which we'll also revisit next week.

    cc @ramin_hal9001

    #LispyGopher #Gopher #Lisp #CommonLisp

    Uncategorized lispygopher gopher lisp commonlisp