@screwlisp is having some site connectivity problems so asked me to remind everyone that we'll be on the anonradio forum at the top of the hour (a bit less than ten minutes hence) for those who like that kind of thing:

Uncategorized · Tags: lispygopher, gopher, lisp, commonlisp · 120 Posts, 14 Posters, 163 Views
  • wrog@mastodon.murkworks.net:

    @kentpitman @screwlisp @cdegroot @ramin_hal9001

    There were other weirdnesses as well.

    Even if GC saves you from the horror of referencing freed storage, or of freeing stuff twice, you still have to worry about memory leaks -- and moreover, dropping references as fast as you can matters.

    With copying GC, leaks are useless shit that has to be copied -- yes, it eventually ends up in an old generation, but until then it's getting copied -- and copying is where generational GC does its work. It's stuff unnecessarily surviving into the medium term that hurts you the most (generational GC *relies* on stuff becoming garbage as quickly as possible).

    And so, tracking down leaks and finding places to put in weak pointers started mattering more...

    4/3

    screwlisp@gamerplus.org (#18):

    @wrog
    Did you see the garbage collection handbook's note on performance depending on having about five times as much memory as was technically needed? @dougmerritt
    @kentpitman @cdegroot @ramin_hal9001

    • wrog@mastodon.murkworks.net (#19):

      @screwlisp @kentpitman @cdegroot @ramin_hal9001 @dougmerritt

      5? maybe for mark&sweep

      but I can't see how more than 2 would ever be necessary for a copying GC. Once you have enough space to copy everything *to* (on the off-chance that absolutely everything actually *needs* to be copied), you're basically done...

      ... and if you're following the usual pattern where 90% of what you create becomes garbage almost immediately, you can get by with far less.
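Editor's aside: the "2x" bound wrog describes can be seen in a toy Cheney-style semispace collector -- to-space only ever needs room for what is actually live, so a heap twice the worst-case live size always suffices. A minimal sketch (all names illustrative, not from any real runtime):

```python
class Cell:
    """A cons-like heap object with up to two references."""
    def __init__(self, value, refs=()):
        self.value = value
        self.refs = list(refs)
        self.forward = None   # forwarding pointer, set once the cell is copied

def collect(roots):
    """Copy everything reachable from `roots` into a fresh to-space list."""
    to_space = []

    def copy(obj):
        if obj.forward is None:               # not yet moved
            clone = Cell(obj.value, obj.refs)  # refs fixed up by the scan below
            obj.forward = clone
            to_space.append(clone)
        return obj.forward

    new_roots = [copy(r) for r in roots]
    # Cheney scan: to_space doubles as the work queue; `scan` chases the
    # allocation pointer, rewriting each copied cell's refs to to-space.
    scan = 0
    while scan < len(to_space):
        cell = to_space[scan]
        cell.refs = [copy(r) for r in cell.refs]
        scan += 1
    return new_roots, to_space

# The usual pattern: most allocations die young. The garbage never reaches
# a root, so it is simply never copied -- no per-object "free" work at all.
garbage = [Cell(i) for i in range(9)]          # unreachable
b = Cell("b")
a = Cell("a", [b])                             # live chain: a -> b
roots, heap = collect([a])
assert len(heap) == 2                          # only the live cells moved
```

The cost is proportional to live data, not heap size, which is why the "90% dies immediately" pattern makes copying so cheap.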

      • kentpitman@climatejustice.social:

        @nosrednayduj @screwlisp @cdegroot

        And, unrelated, another reference I made in the show was to Clyde Prestowitz and his book The Betrayal of American Prosperity.
        https://www.goodreads.com/book/show/8104391-the-betrayal-of-american-prosperity

        Also an essay I wrote that summarizes a key point from it, though not really related to the topic of the show. I mention it just because that point may also be interesting to this audience on the issue of capitalism, if not on the specific economic issue we were talking about tonight:
        https://netsettlement.blogspot.com/2012/01/losing-war-in-quiet-room.html

        kentpitman@climatejustice.social (#20):

        @nosrednayduj @screwlisp @cdegroot

        Also Naomi Klein's book The Shock Doctrine, very politically relevant this week, traces a lot of political ills to Milton Friedman and his ideas.

        https://www.goodreads.com/book/show/1237300.The_Shock_Doctrine

        • ramin_hal9001@fe.disroot.org (#21):

          @wrog@mastodon.murkworks.net Haskell was first invented in 1990 or '91-ish, and at that time people had already started to ask questions like "what if we just ban set! entirely?" -- abolish mutable variables, make everything lazily evaluated by default. If you have been programming in C/C++ for a while, the idea that abolishing mutable variables would lead to a performance increase seems very counter-intuitive.

          But it does, for all the reasons you mentioned about not forcing a search for updated pointers in old-generation GC heaps -- and also because it forces the programmer to write source code that is essentially already in Static Single Assignment (SSA) form, nowadays an optimization pass that most compilers perform prior to register allocation. This allows more aggressive optimizations to be used and results in more efficient code.
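Editor's sketch of what "already in SSA form" means in practice: every value gets a fresh binding instead of reassigning one variable, and the recursive helper's parameters play the role of SSA phi-nodes merging control-flow paths. Purely illustrative names:

```python
# A loop that mutates `total` (the same variable is re-assigned repeatedly):
def sum_mutable(xs):
    total = 0
    for x in xs:
        total = total + x      # `total` is re-bound on every iteration
    return total

# The same computation in SSA-like style: no binding is ever mutated.
# Each iteration's accumulator is a *new* name; the recursion's parameters
# act like phi-nodes joining the values from different paths.
def sum_ssa(xs):
    def go(i, acc):
        if i == len(xs):
            return acc
        acc_next = acc + xs[i]   # fresh name instead of reassignment
        return go(i + 1, acc_next)
    return go(0, 0)

assert sum_mutable([1, 2, 3]) == sum_ssa([1, 2, 3]) == 6
```

A compiler's SSA pass performs this renaming automatically; an immutable language simply gets it for free from the programmer.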

          @screwlisp@gamerplus.org @kentpitman@climatejustice.social @cdegroot@mstdn.ca @dougmerritt@mathstodon.xyz

          • wrog@mastodon.murkworks.net:

            @kentpitman @screwlisp @cdegroot @ramin_hal9001

            See, setq/set! is a total disaster for generational GC. It bashes old-space cells to point to new-space; the premise of generational GC being that this mostly shouldn't happen. The super-often new-generation-only pass is now doing a whole lot of old-space traversal because of all of those cells added to the root set by the set! calls, ... which then loses most of the benefit of generational GC.
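Editor's aside: the mechanism behind "cells added to the root set by the set! calls" is the write barrier and remembered set. A toy illustration (not a real collector; all names hypothetical):

```python
# Every field mutation is routed through `write`. Stores that make an
# old-generation object point at a young-generation object are recorded in
# a "remembered set", so the frequent young-only GC pass can treat those
# old cells as extra roots -- which is exactly the extra work set! creates.

class Obj:
    def __init__(self, generation):
        self.generation = generation  # "old" or "young"
        self.fields = {}

remembered_set = set()

def write(obj, field, value):
    # The write barrier: detect old -> young stores.
    if isinstance(value, Obj) and obj.generation == "old" \
            and value.generation == "young":
        remembered_set.add(obj)
    obj.fields[field] = value

old = Obj("old")
young = Obj("young")
write(old, "car", young)   # like (set-car! old-cell new-thing)
# The "cheap" young-only pass must now scan remembered_set in addition to
# the stack: every set! into old-space grows that pass's workload.
```

Side-effect-free code never triggers the barrier, which is the concrete sense in which it "wins" under generational GC.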

            (fluid-let and dynamic-wind also became way LESS cheap, mainly due to missing multiple optimization opportunities)

            In short, with generational GC, straightforward side-effect-free code wins. It took a while for me to recalibrate my intuitions re what sorts of things were fast/cheap vs not.

            3/3

            cdegroot@mstdn.ca (#22):

            @wrog @kentpitman @screwlisp @ramin_hal9001 it's a good chunk of the reason why Erlang shines here. Per-process GC can be kept simple (a process is more like an object than a thread, so you have lots of them) and no equivalent of setq - all data is immutable.

            (there is a shared heap, but that also is just immutable data).

            • ramin_hal9001@fe.disroot.org (#23):

              @cdegroot@mstdn.ca yes, the BEAM virtual machine is pretty amazing technology; there are very good reasons why it is used in telecom and other scenarios where zero downtime is a priority. I think .NET and Graal have been slowly incorporating more of BEAM's features into their own runtimes. For about three years now, .NET has been able to do "hot code reloading," for example.

              I have used Erlang before but not Elixir. I think I would like Elixir better because of its slightly more Haskell-like type system.

              @wrog@mastodon.murkworks.net @kentpitman@climatejustice.social @screwlisp@gamerplus.org

              • cdegroot@mstdn.ca (#24):

                @ramin_hal9001 @kentpitman @screwlisp @wrog not just zero downtime, the more important aspect is how it does concurrency, how it manages to scale that, and how well it fits the modern requirements of "webapps" (like a glove).

                It changed my thinking about objects, just like Smalltalk did before. I'm fully on board with Joe Armstrong's quip that Erlang is "the most OO language" (or something to that effect); having objects with effectively their own address space, their own processor scheduling, etc. completely changes how you think about building scalable concurrent systems (and _then_ you get clustering for free, and sometimes hot reloading is a production thing, although 99% of the time it is good to have it in the REPL).

                • dougmerritt@mathstodon.xyz (#25):

                  @wrog
                  > but I can't see how more than 2 would ever be necessary for a copying GC

                  It's not "necessary", it's "to make GC performance a negligeable percentage of overall CPU".

                  It was about a theoretical worst case as I recall, certainly not about one particular algorithm.

                  And IIRC it was actually a factor of 7 -- 5 is merely a good mnemonic which may be close enough. (e.g. perhaps 5-fold keeps overhead down to 10-20% rather than 7's 1%, although I'm making it up to give the flavor -- I haven't read the book for 10-20 years)

                  But see the book (may as well use the second edition) if and when you care; it's excellent. Mandatory I would say, for anyone who wants to really really understand all aspects of garbage collection, including performance issues.
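Editor's note on where a "several times the live data" figure comes from: the standard back-of-envelope for a semispace copying collector (this is the textbook analysis, not a quote from the Handbook, so treat the exact constants as illustrative). With heap size $H$ and live data $L$, each collection does work proportional to $L$, and roughly $H/2 - L$ words can be allocated between collections:

```latex
% Amortized GC cost per allocated word for a semispace copying collector,
% heap size H, live data L, copying cost c per live word:
\[
  \text{cost per allocated word} \;\approx\; \frac{c\,L}{H/2 - L}
\]
% As H grows relative to L the denominator grows and overhead falls toward
% zero; at H = 2L it blows up. A heap of a few multiples of L is what buys
% "negligible" overhead -- hence figures like 5x or 7x in the literature.
```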

                  @screwlisp @kentpitman @cdegroot @ramin_hal9001

                  • kentpitman@climatejustice.social:

                    At the end of @screwlisp's show, in the discussion of @cdegroot's book, @ramin_hal9001 was talking about continuations. I wanted to make a random point that isn't often made about Lisp that I think is important.

                    I often do binary partitions of languages (like the static/dynamic split, but more exotic), and one of them is whether they are leading or following, let's say. There are some aspects in which Scheme is a follower, not a leader, in the sense that it tends to eschew some things that Common Lisp does for a variety of reasons, one of them being "we don't know how to compile this well". There is a preference for a formal semantics that is very tight and for everything being well understood. It is perhaps fortunate that Scheme came along after garbage collection was well worked out and did not seem to fear that it would be a problem, but I would say that Lisp had already basically led on garbage collection.

                    The basic issue is this: Should a language incorporate things that maybe are not really well understood, just because people need to do them, on the assumption that it might as well standardize the 'gesture' (to use the CLIM terminology) or 'notation' (to use the more familiar term) for saying you want to do that thing?

                    Scheme did not like Lisp macros, for example, and only adopted macros when hygienic macros were worked out. Lisp, on the other hand, started with the idea that macros were just necessary and worried about the details of making them sound later.

                    Scheme people (and I'm generalizing to make a point here, with apologies for casting an entire group with a broad brush that is probably unfair) think Common Lisp macros are more unhygienic than they actually are, because they don't give enough credit to things like the package system, which Scheme does not have, and which protects CL users from collisions a lot more than they give credit for. They also don't fairly understand the degree to which Lisp2 protects against the most common scenarios, ones that would happen all the time in Scheme if it had a symbol-based macro system. So CL isn't really as much at risk these days, though it was a bigger issue before packages. The point is that Lisp decided the feature was too important to leave out and that it would figure out how to tighten things later, whereas Scheme held back its design until it knew.

                    But, and this is where I wanted to get to, Scheme led on continuations. That's a hard problem and while it's possible, it's still difficult. I don't quite remember if the original language feature had fully worked through all the tail call situations in the way that ultimately it did. But it was brave to say that full continuations could be made adequately efficient.

                    And the Lisp community in general -- and here I will include Scheme in that, though on other days I think these communities are sufficiently different that I would not -- has collectively been much more brave and leading than many languages, which only grudgingly allow functionality that they know how to compile.

                    In the early days of Lisp, the choice to do dynamic memory management was very brave. It took a long time to make GCs efficient, and generational GC was what I think finally made people believe this could be done well in large address spaces. (In small address spaces, it was possible because touching all the memory to do a GC did not introduce thrashing if data was "paged out". And in modern hardware, memory is cheap, so the size is not always a per se issue.)

                    But there was an intermediate time in which lots of memory was addressable but not fully realized as RAM, only virtualized, and GC was a mess in that space.

                    The Lisp Machines had 3 different unrelated but co-resident and mutually usable garbage collection strategies that could be separately enabled, 2 of them using hardware support (typed pointers) and one of them requiring that computation cease for a while because the virtual machine would be temporarily inconsistent for the last-ditch thing that particular GC could do to save the day when otherwise things were going to fail badly.

                    For a while, dynamic memory management would not be used in real time applications, but ultimately the bet Lisp had made on it proved that it could be done, and it drove the doing of it in a way that holding back would not have.

                    My (possibly faulty) understanding is that the Java GC was made to work by at least some displaced Lisp GC experts, for example. But certainly the choice to make Java be garbage collected probably derives from the Lispers on its design team feeling it was by then a solved problem.

                    This aspect of languages' designs, whether they lead or follow, whether they are brave or timid, is not often talked about. But I wanted to give the idea some air. It's cool to have languages that can use existing tech well, but cooler, I personally think, to see designers consciously driving the creation of such tech.

                    dougmerritt@mathstodon.xyz (#26):

                    @kentpitman
                    I respect you, and your contributions to Lisp and the community. So I dislike nitpicking you. But:

                    > Common Lisp macros more unhygienic than they actually are

                    This is a biased phrasing. There are hygienic macro systems, and unhygienic macro systems. One cannot assign a degree of "hygienic-ness" without simultaneously defining what metric you are introducing.

                    We all can agree that one can produce great code in Common Lisp. It's not like Scheme is *necessary* for that.

                    But de gustibus non est disputandum. There are objective qualities of various macro systems -- and then there's people's preferences about those qualities.

                    Bottom line: it seems you are saying that Lisp macros aren't so bad if their use is constrained to safe uses, and I would agree with *that*.

                    @screwlisp @cdegroot @ramin_hal9001

                    • dougmerritt@mathstodon.xyz (#27):

                      @kentpitman
                      > I don't quite remember if the original language feature had fully worked through all the tail call situations in the way that ultimately it did.

                      My memory is that the Scheme interface for continuations was completely worked out when Scheme was born, but implementation issues were not -- beyond existence proof that is.

                      > But it was brave to say that full continuations could be made adequately efficient.

                      Yes it was!

                      > the Lisp community in general, and here I will include Scheme in that

                      Planner, for instance, went in a quite different direction. Micro-Planner (and its SHRDLU) inspired Prolog. Robert Kowalski said that "Prolog is what Planner should have been" (it included unification but excluded pattern-directed invocation, for example); see Kowalski, R. (1988), "Logic Programming," Communications of the ACM, 31(9) -- although the precise phrasing, I think, is from interviews.

                      Anyway, Prolog was not a Lisp, but sure, definitely Scheme is. The history of Lisp spinoffs created quite a bit of CS history.

                      I did professional development in Scheme (at Autodesk, before that division was axed 😞) -- it's certainly a workable language in the real world.

                      But we know that Common Lisp is too, obviously.

                      @screwlisp @cdegroot @ramin_hal9001

                      • wrog@mastodon.murkworks.net:

                        @kentpitman @screwlisp @cdegroot @ramin_hal9001

                        Generational GC changes the way you program and it's not *just* that it's efficient.

                        We used MIT Scheme (which, by the early 90s, was showing its age). We did all manner of weird optimizing to use memory efficiently. Lots of set! to re-use structure where possible. Or (map! f list) -- same as (map ...) but with set-car! to modify in place -- because not recreating all of those cons cells made a HUGE difference: extra allocation bumps memory use, so the next GC round comes that much sooner (and then everything STOPS, because mark & sweep). Also stupid (fluid-let ...) tricks to save space in closures.

                        We were writing Scheme as if it were C because that was how you got speed in that particular world.

                        1/3
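Editor's aside: the (map ...) vs (map! ...) trade-off above, sketched in Python for contrast (the function names are illustrative, not any real library's):

```python
# Allocating a fresh result per pass versus mutating cells in place.
# Under a non-generational mark & sweep collector, the allocating version
# fills the heap faster and so triggers full stop-the-world collections
# sooner -- which is why in-place mutation was the "fast" style back then.

def map_fresh(f, xs):
    return [f(x) for x in xs]      # like (map f list): new cells every call

def map_inplace(f, xs):
    for i in range(len(xs)):       # like (map! f list): set-car! each cell
        xs[i] = f(xs[i])
    return xs

data = [1, 2, 3]
assert map_fresh(lambda x: x * 2, data) == [2, 4, 6]
assert data == [1, 2, 3]           # original untouched: fresh cells were made
map_inplace(lambda x: x * 2, data)
assert data == [2, 4, 6]           # same cells, updated in place
```

Under a generational collector the calculus flips: the fresh cells are cheap young garbage, while the in-place stores hit the write barrier.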

                        dougmerritt@mathstodon.xyz (#28):

                        @wrog
                        'setq' and friends have been criticized forever, but avoiding mutation is easier said than done. Parsing arbitrarily large sexprs requires mutation behind the scenes -- which, ideally, is where it should stay.

                        Any language we use that helps avoid mutation is a good thing. 100% avoidance is a matter of opinion -- some people claim it was proven to be fully avoidable decades ago, others say the jury is still out on the 100% part.

                        I don't know enough to have an opinion on whether 100% has been completely proven, but it's attractive.

                        @kentpitman @screwlisp @cdegroot @ramin_hal9001

                        1 Reply Last reply
                        0
                        • kentpitman@climatejustice.socialK kentpitman@climatejustice.social

                          At the end of @screwlisp's show, in the discussion of @cdegroot's book, @ramin_hal9001 was talking about continuations. I wanted to make a random point that isn't often made about Lisp that I think is important.

                          I often do binary partitions of languages (like the static/dynamic split, but more exotic), and one of them is whether they are leading or following, let's say. there are some aspects in which scheme is a follower, not a leader, in the sense that it tends to eschew some things that Common Lisp does for a variety of reasons, but one of them is "we don't know how to compile this well". There is a preference for a formal semantics that is very tight and that everything is well-understood. It is perhaps fortunate that Scheme came along after garbage collection was well-worked and did not seem to fear that it would be a problem, but I would say that Lisp had already basically dealt led on garbage collection.

                          The basic issue is this: Should a language incorporate things that maybe are not really well-understood but just because people need to do them and on an assumption that they might as well standardize the 'gesture' (to use the CLIM terminology) or 'notation' (to use the more familiar) for saying you want to do that thing.

                          Scheme did not like Lisp macros, for example, and only adopted macros when hygienic macros were worked out. Lisp, on the other hand, started with the idea that macros were just necessary and worried about the details of making them sound later.

                          Scheme people (and I'm generalizing to make a point here, with apologies for casting an entire group with a broad brush that is probably unfair) think Common Lisp macros more unhygienic than they actually are because they don't give enough credit to things like he package system, which Scheme does not have, and which protects CL users a lot more than they give credit for in avoiding collisions. They also don't fairly understand the degree to which Lisp2 protects from the most common scenarios that would happen all the time in Scheme if there were a symbol-based macro system. So CL isn't really as much at risk these days, but it was a bigger issue before packages, and the point is that Lisp decided it would figure out how to tighten later, but that it was too important to leave out, where Scheme held back design until it knew.

                          But, and this is where I wanted to get to, Scheme led on continuations. That's a hard problem and while it's possible, it's still difficult. I don't quite remember if the original language feature had fully worked through all the tail call situations in the way that ultimately it did. But it was brave to say that full continuations could be made adequately efficient.

                          And the Lisp community in general, and here I will include Scheme in that, though on other days I think these communities are sufficiently different that I would not, has collectively been much more brave and leading than many language communities, which only grudgingly allow functionality that they know how to compile.

                          In the early days of Lisp, the choice to do dynamic memory management was very brave. It took a long time to make GCs efficient, and generational GC was what finally, I think, made people believe this could be done well in large address spaces. (In small address spaces it was possible because touching all the memory to do a GC did not introduce thrashing, since data was not "paged out.") And in modern hardware, memory is cheap, so the size is not always a per se issue.

                          But there was an intermediate time in which lots of memory was addressable but not fully realized as RAM, only virtualized, and GC was a mess in that space.

                          The Lisp Machines had 3 different unrelated but co-resident and mutually usable garbage collection strategies that could be separately enabled, 2 of them using hardware support (typed pointers) and one of them requiring that computation cease for a while because the virtual machine would be temporarily inconsistent for the last-ditch thing that particular GC could do to save the day when otherwise things were going to fail badly.

                          For a while, dynamic memory management would not be used in real time applications, but ultimately the bet Lisp had made on it proved that it could be done, and it drove the doing of it in a way that holding back would not have.

                          My (possibly faulty) understanding is that the Java GC was made to work by at least some displaced Lisp GC experts, for example. But certainly the choice to make Java be garbage collected probably derives from the Lispers on its design team feeling it was by then a solved problem.

                          This aspect of languages' designs, whether they lead or follow, whether they are brave or timid, is not often talked about. But I wanted to give the idea some air. It's cool to have languages that can use existing tech well, but cooler, I personally think, to see designers consciously driving the creation of such tech.

                          dougmerritt@mathstodon.xyzD This user is from outside of this forum
                          dougmerritt@mathstodon.xyz
                          wrote last edited by
                          #29

                          @kentpitman
                          > 2 of them using hardware support (typed pointers)

                          I learned about typed pointers from Keith Sklower, from my brief involvement in the earliest days (1978?) of Berkeley's Franz Lisp (implemented in order to support the computer algebra Macsyma port to Vaxima), and it blew my mind. Horizons extended hugely.

                          A few years later everyone seemed to just take the idea in stride. Yet no one seems to comment on the impact of big-endian versus little-endian architectures on typed pointers; everyone seems to regard it as a matter of taste. It's not always; it impacts low-level implementations.

                          @screwlisp @cdegroot @ramin_hal9001

                          1 Reply Last reply
                          0
                          • kentpitman@climatejustice.socialK kentpitman@climatejustice.social

                            At the end of @screwlisp's show, in the discussion of @cdegroot's book, @ramin_hal9001 was talking about continuations. I wanted to make a random point that isn't often made about Lisp that I think is important.


                            dougmerritt@mathstodon.xyzD This user is from outside of this forum
                            dougmerritt@mathstodon.xyz
                            wrote last edited by
                            #30

                            @kentpitman
                            >My (possibly faulty) understanding is that the Java GC was made to work by at least some displaced Lisp GC experts

                            I used to regularly talk to the technical lead for that group at Sun for unimportant reasons, and I have every reason to think that the entire team was absolutely brilliant.

                            I don't recall whether some of them were displaced Lisp GC experts, but I do recall that I had plenty of criticisms about Java the language, but tended to find few, if any, about Java the implementation. And they kept improving it.

                            @screwlisp @cdegroot @ramin_hal9001

                            1 Reply Last reply
                            0
                            • dougmerritt@mathstodon.xyzD dougmerritt@mathstodon.xyz

                              @kentpitman
                              I respect you, and your contributions to Lisp and the community. So I dislike nitpicking you. But:

                              > Common Lisp macros more unhygienic than they actually are

                              This is a biased phrasing. There are hygienic macro systems, and unhygienic macro systems. One cannot assign a degree of "hygienic-ness" without simultaneously defining what metric you are introducing.

                              We all can agree that one can produce great code in Common Lisp. It's not like Scheme is *necessary* for that.

                              But de gustibus non est disputandum. There are objective qualities of various macro systems -- and then there's people's preferences about those qualities.

                              Bottom line: it seems you are saying that Lisp macros aren't so bad if their use is constrained to safe uses, and I would agree with *that*.

                              @screwlisp @cdegroot @ramin_hal9001

                              kentpitman@climatejustice.socialK This user is from outside of this forum
                              kentpitman@climatejustice.social
                              wrote last edited by
                              #31

                              @dougmerritt @screwlisp @cdegroot @ramin_hal9001

                              > it seems you are saying that Lisp macros aren't so bad if their use is constrained to safe uses

                              Well, what I'm saying isn't formal, and that in itself bugs some people. But the usual criticism of the CL system isn't that "people have to be careful", it's that "ordinary use is not safe". But there's safe and then there's safe.

                              There is a sense in which C is objectively less safe than, say, Python or Lisp. And there is a sense in which people who write languages that aspire to more proofs think those languages still are not safe. So there's a bit of a continuum here that makes terminology tricky, so I have to make some assumptions that are fragile because some after-the-fact dodging can be done where critics do not acknowledge the incremental strengths, they just keep pointing out other problems as if that's what they meant all along.

                              In Scheme, and ignoring that you could do this functionally, writing a macro foo that takes an argument and yields the list of that argument can't look like `(list ,thing) because if it is used in some situation like (define (bar list) (foo list)) you would fall victim to namespace clashes. And so Scheme people dislike this paradigm. But even without careful planning, the same problem is FAR LESS likely to happen in CL because:

                              Parameters that might get captured are usually in the variable namespace. You CAN bind functions, but it's rare, and it's super-rare for the names chosen to be things that would be the name of a pre-defined function. You'd have to be in some context where someone had done (flet ((list ...)) ...) for the list function to be bound to something unexpected, and even then you're not supposed to bind list to something unexpected for other reasons, mainly that the symbol list is shared.

                              I allege that in the natural course of things, it's FAR more rare for the expansion of a macro to ever contain something that would get unexpectedly captured, for reasons that do not exist in the scheme world. Formally, yes, there is still a risk, but what makes this such an urgency in the Scheme world are the choices to have a Lisp1 and the choice to have no package system. Each of these things creates an insulation. In practice, the functional part of the CL world does not vary, as uses of FLET are very rare. And it's equally rare for a macro to expand into free references that are not functional references.

                              Also, the CL world has gensyms easily available, and CL systems often have other mechanisms that package up their use to be easy. In the Scheme world, there is no gensym, and the language semantics is defined not on objects but on the notation itself. This makes things hard to compare, but it also obscures how package separation eliminates a broad class of the surprises: usually you know what's in your own package and aren't affected by what's in someone else's, whereas in Scheme symbols are just symbols and it's far more dangerous to rely on lexical context to sort everything out.

                              So yes, CL is less dangerous if you limit yourself, but also it's less dangerous because a lot of the time you don't have to think hard about limiting yourself. The language features it has create naturally safer situations. Note I am making a relative, not an absolute, measurement of safety. I'm saying that if CL were full of the conflict opportunities that Scheme is, we'd have rushed to use hygiene, too. But mostly it wasn't, so no one felt the urge.
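                              To make the capture scenario concrete, here is a minimal Common Lisp sketch (my own illustration, not from the thread) of the classic variable-capture bug and the GENSYM idiom mentioned above:

```lisp
;; A naive SWAP macro that hard-codes a temporary variable name:
(defmacro bad-swap (a b)
  `(let ((tmp ,a))
     (setf ,a ,b)
     (setf ,b tmp)))

;; (bad-swap tmp x) expands into a LET that shadows the caller's TMP,
;; so the swap silently goes wrong.

;; The conventional CL fix: generate a fresh, uninterned symbol
;; for the temporary, so no user name can collide with it.
(defmacro swap (a b)
  (let ((tmp (gensym "TMP")))
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))
```

The uninterned symbol from GENSYM cannot clash with any name at the macro's use site; that manual discipline is what hygienic macro systems automate.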

                              dougmerritt@mathstodon.xyzD 1 Reply Last reply
                              0
                              • ramin_hal9001@fe.disroot.orgR ramin_hal9001@fe.disroot.org

                                @wrog@mastodon.murkworks.net Haskell was first invented in 1990 or '91-ish, and at that time they had already started to ask questions like "what if we just ban set! entirely?": abolish mutable variables, make everything lazily evaluated by default. If you have been programming in C/C++ for a while, the idea that abolishing mutable variables would lead to a performance increase seems very counter-intuitive.

                                But for all the reasons you mentioned about not forcing a search for updated pointers in old-generation GC heaps, and also because it forces the programmer to write source code that is essentially already in Static Single Assignment (SSA) form (nowadays an optimization pass that most compilers run prior to register allocation), this allowed more aggressive optimization and results in more efficient code.

                                @screwlisp@gamerplus.org @kentpitman@climatejustice.social @cdegroot@mstdn.ca @dougmerritt@mathstodon.xyz
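                                A tiny hand-written illustration (mine, not Ramin's) of what SSA conversion does: every mutation of one variable becomes a binding of a fresh name.

```lisp
;; Mutable style: X is updated in place twice.
(defun step-mutably (x)
  (setf x (* x 2))
  (setf x (+ x 1))
  x)

;; Single-assignment style: each update binds a fresh name.
;; This is essentially the SSA form a compiler would produce,
;; and it is the only style available once mutation is banned.
(defun step-ssa (x0)
  (let* ((x1 (* x0 2))
         (x2 (+ x1 1)))
    x2))
```

Both functions compute the same value; the second just makes every intermediate result a distinct, immutable binding.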

                                kentpitman@climatejustice.socialK This user is from outside of this forum
                                kentpitman@climatejustice.social
                                wrote last edited by
                                #32

                                @ramin_hal9001 @screwlisp @wrog @dougmerritt @cdegroot

                                The LispM did a nice thing (at some tremendous cost in hardware, I guess, but useful in the early days) by having various kinds of forwarding pointers for this. At least you knew you were going to incur overhead, though, and pricing it properly at least said there was a premium for not side-effecting and tended to cause people to not do it. And the copying GC could fix the problem eventually, so you didn't pay the price forever, though you did pay for having such specific hardware or for cycles in systems trying to emulate that which couldn't hide the overhead cost. I tend to prefer the pricing model over the prohibition model, but I see both sides of that.

                                If my memory is correct (so yduJ or wrog please fix me if I goof this): MOO, as a language, is in an interesting space in that actual objects are mutable but list structure is not. This reflects the fact that it's very unlikely that you allocated an actual object (what CL would call a standard class, but the uses are different in MOO because all of those objects are persistent and less likely to be allocated casually, so less likely to be garbage the GC would want to be involved in anyway).

                                I always say "good" or "bad" is true in a context. It's not true that side effect is good or bad in the abstract, it's a property of how it engages the ecology of other operations and processes.

                                And, Ramin, the abolishing of mutable variables has other intangible expressional costs, so it's not a simple no-brainer. But yes, if people are locked into a mindset that says such changes couldn't improve performance, they'd be surprised. Ultimately, I prefer to design languages around how people want to express things, and I like occasionally doing mutation even if it's not common, so I like languages that allow it and don't mind if there's a bit of a penalty for it or if one says "don't do this a lot because it's not aesthetic or not efficient or whatever".

                                To make a really crude analogy, one has free speech in a society not to say the ordinary things one needs to say. Those things are favored speech regardless because people want a society where they can do ordinary things. Free speech is everything about preserving the right to say things that are not popular. So it is not accidental that there are controversies about it. But it's still nice to have it in those situations where you're outside of norms for reasonable reasons. ๐Ÿ™‚

                                dougmerritt@mathstodon.xyzD 1 Reply Last reply
                                0
                                • wrog@mastodon.murkworks.netW wrog@mastodon.murkworks.net

                                  @kentpitman @screwlisp @cdegroot @ramin_hal9001

                                  There were other weirdnesses as well.

                                  Even if GC saves you the horror of referencing freed storage, or freeing stuff twice, you still have to worry about memory leaks and moreover, dropping references as fast as you can matters

                                  With copying GC, leaks are useless shit that has to be copied -- yes it eventually ends up in an old generation but until then it's getting copied -- and copying is where generational GC is doing work, and it's stuff unnecessarily surviving to the medium term that hurts you the most (generational GC *relies* on stuff becoming garbage as quickly as possible)

                                  And so, tracking down leaks and finding places to put in weak pointers started mattering more...

                                  4/3
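                                  There is no weak pointer in the Common Lisp standard, but most implementations provide one. A sketch using SBCL's implementation-specific extension (sb-ext) of how a weak reference lets the GC reclaim an object that a cache still points at:

```lisp
;; SBCL-specific sketch: a weak pointer does not, by itself,
;; keep its target alive.
(defvar *big* (make-array 1000))                  ; strong reference
(defvar *weak* (sb-ext:make-weak-pointer *big*))  ; weak reference

;; While *BIG* holds a strong reference,
;; (sb-ext:weak-pointer-value *weak*) yields the array.

(setf *big* nil)     ; drop the last strong reference
(sb-ext:gc :full t)  ; request a full collection

;; Now (sb-ext:weak-pointer-value *weak*) may return NIL:
;; only the weak pointer referenced the array, so the GC
;; was free to reclaim it.
```

This is the "places to put in weak pointers" move: a cache or table keeps a way to find an object without forcing it to survive to the next generation.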

                                  kentpitman@climatejustice.socialK This user is from outside of this forum
                                  kentpitman@climatejustice.social
                                  wrote last edited by
                                  #33

                                  @wrog @screwlisp @cdegroot @ramin_hal9001

                                  Thanks for this detailed reply. Lotta good stuff there. Also thanks especially for indulging the improper fraction. I mostly do not use the fractional labeling for posts for fear of that scenario. Sometimes you promise to stop and then realize you want to keep going and feel impeded. I'm glad you kept on.

                                  wrog@mastodon.murkworks.netW 1 Reply Last reply
                                  0
                                  • kentpitman@climatejustice.socialK kentpitman@climatejustice.social


                                    wrog@mastodon.murkworks.netW This user is from outside of this forum
                                    wrog@mastodon.murkworks.net
                                    wrote last edited by
                                    #34

                                    @kentpitman @screwlisp @cdegroot @ramin_hal9001

                                    yeah, usually I start with 1/n and then edit all of the n's later, but today I was lazy

                                    1 Reply Last reply
                                    0
                                    • kentpitman@climatejustice.socialK kentpitman@climatejustice.social


                                      dougmerritt@mathstodon.xyzD This user is from outside of this forum
                                      dougmerritt@mathstodon.xyz
                                      wrote last edited by
                                      #35

                                      @kentpitman
                                      > Ultimately, I prefer to design languages around how people want to express things, and I like occasionally doing mutation even if it's not common, so I like languages that allow it and don't mind if there's a bit of a penalty for it or if one says "don't do this a lot because it's not aesthetic or not efficient or whatever".

                                      Me too -- although I remain open to possibilities. Usually such possibilities want me to switch paradigms, though, not just add to my toolbox.

                                      @ramin_hal9001 @screwlisp @wrog @cdegroot

                                      ramin_hal9001@fe.disroot.orgR 1 Reply Last reply
                                      0
                                      • kentpitman@climatejustice.socialK kentpitman@climatejustice.social


                                        dougmerritt@mathstodon.xyzD This user is from outside of this forum
                                        dougmerritt@mathstodon.xyz
                                        wrote last edited by
                                        #36

                                        @kentpitman
                                        On the one hand, that is all well said.

                                        On the other hand, I always have some nitpicky reply. ๐Ÿ™‚

                                        (On the gripping hand -- no, I'll stop there)

                                        You're talking about what is common and what is rare, and I can see why such was your overriding concern.

                                        But I feel like I'm always the guy who ends up needing to fix the rare cases that then happen in real life.

                                        For instance, when implementing a language that is wildly different than the implementation language -- "rare" seems to come up a lot there.

                                        And also many times when I am bending heaven and earth to serve my will despite the obstinacy of the existing software infrastructure. "Just don't do that", people say.

                                        It is indeed a lot like the needs of the formal verification by proof community, that is looking for actual math proofs, versus mundane everyday user needs.

                                        Humpty Dumpty said "The question is, which is to be the master -- that's all" ("Through The Looking Glass", by Lewis Carroll).

                                        Here, perhaps the master is which community you aim to serve.

                                        @screwlisp @cdegroot @ramin_hal9001

                                          kentpitman@climatejustice.social
                                          wrote last edited by
                                          #37

                                          @dougmerritt @screwlisp @cdegroot @ramin_hal9001

                                          Well, I'm just trying to explain why hygiene seems more like a crisis to the Scheme community than it did to the CL community, who mostly asked "why is this a big deal?". It is a big deal in Scheme. And it's not because of the mindset, it's because different designs favor different outcomes.

                                          In other words, the CL community would have been outraged if we had overcomplicated macros, while the Scheme community was grateful for safety it actually perceived a need for.

                                          So yes, "the master is which community you aim to serve". We agree on that. ๐Ÿ™‚
