Assume mainstream adoption means being used by around 7% of all GitHub projects.

Personally, I’d like to see Nim get that growth.

  • UraniumBlazer@lemm.ee · 1 year ago

    I don’t know why you’re being downvoted, but this could truly be the future of programming languages. We don’t have to manually compile everything to assembly today, do we? Imagine simply using English for pseudocode, with an AI compiler that writes the most performant code… How much would that speed up development time? No one would need to know different languages… The learning curve for programming relatively basic shit would be low.

    I dunno, but I’ve seen a lot of unnecessary hate for AI in left-leaning communities…

    • robinm@programming.dev · 1 year ago

      Syntax has never really been an issue. The closest things to plain English programming are legal documents and contracts. As you can see, they are horrible to understand, but that’s the only way to correctly specify exactly what you want, and code is much better at it. Another data point is visual languages like Lego Mindstorms or LabVIEW: it’s quite easy to do basic things, but they don’t scale at all.

      • UraniumBlazer@lemm.ee · 1 year ago (edited)

        Syntax has never really been an issue.

        But it has tho… For example, I do not know Rust. I want to add the notifications functionality to Lemmy. Lemmy is in Rust. To implement this relatively simple API, I need to learn Rust to a degree. Then, I need to look at Lemmy’s file structure to understand the project further to actually do what I want to do. What if this all could be abstracted away by me simply saying “post xyz to the expo-notifications server whenever someone messages someone”? An AI English-to-Rust interpreter could easily do this.
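
        Something like this rough Rust sketch is the kind of thing I’d expect it to spit out (the Expo endpoint, the payload shape, and the hook point are my guesses for illustration, not Lemmy’s or Expo’s actual API):

        ```rust
        // Rough sketch only: what an English-to-Rust step might emit for
        // "post to the expo-notifications server whenever someone messages someone".
        // Requires reqwest (with the "json" feature), serde_json, and tokio.
        use serde_json::json;

        /// Send a push notification through Expo's push service (assumed endpoint/payload).
        async fn notify_expo(
            client: &reqwest::Client,
            device_token: &str,
            sender: &str,
            body: &str,
        ) -> Result<(), reqwest::Error> {
            client
                .post("https://exp.host/--/api/v2/push/send") // assumed Expo push endpoint
                .json(&json!({
                    "to": device_token,
                    "title": format!("New message from {sender}"),
                    "body": body,
                }))
                .send()
                .await?
                .error_for_status()?;
            Ok(())
        }

        #[tokio::main]
        async fn main() -> Result<(), reqwest::Error> {
            let client = reqwest::Client::new();
            // In a real integration this would be called from Lemmy's private-message
            // handler; here it is invoked directly just to show the shape of the code.
            notify_expo(&client, "ExponentPushToken[xxxxxxxx]", "alice", "hey!").await
        }
        ```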

        The closest things to plain English programming are legal documents and contracts. As you can see, they are horrible to understand, but that’s the only way to correctly specify exactly what you want.

        This is what would define the smartness of the AI, wouldn’t it? Your project manager doesn’t tell you exactly what they want. You have the brains to interpret what they mean and do stuff accordingly, correct?

        • jasory@programming.dev · 1 year ago

          This requires many assumptions that you, or any computational system, have no formal reason to make. Having an interpreter that just guesstimates exactly how you want the program structured is going to run into problems when you, say, want to extend the program.

    • coloredgrayscale@programming.dev · 1 year ago

      A compiler has mostly fixed rules for translation. The English language is often ambiguous, and there are many ways to implement something based on a verbal description.
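
      A toy illustration (a made-up example, not from any real codebase): both of these are defensible readings of “sort the users by name”, and only one of them is what the person asking probably meant.

      ```rust
      // Two equally defensible readings of the verbal spec "sort the users by name".
      // Both compile and look plausible, but they order the same input differently.

      fn sort_case_sensitive(mut users: Vec<&str>) -> Vec<&str> {
          users.sort(); // byte-wise order: capital letters sort before lowercase
          users
      }

      fn sort_case_insensitive(mut users: Vec<&str>) -> Vec<&str> {
          users.sort_by_key(|u| u.to_lowercase()); // ignore case when comparing
          users
      }

      fn main() {
          let users = vec!["alice", "Bob", "carol"];
          println!("{:?}", sort_case_sensitive(users.clone())); // ["Bob", "alice", "carol"]
          println!("{:?}", sort_case_insensitive(users));       // ["alice", "Bob", "carol"]
      }
      ```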

      Programming by using the AI as a “compiler” would likely lead to many bugs that will be hard or impossible to trace without knowing the underlying implementation. But hitting compile again may lead to an accidentally correct implementation, and you’d be none the wiser as to why the tests suddenly pass.

      It’s ok as an assistant to generate boilerplate code, and warn you about some bugs / issues. Maybe a baseline implementation.

      But by the time you’ve exactly described what you want and how you want it, you may as well just write some higher-level code.

      • UraniumBlazer@lemm.ee · 1 year ago

        A compiler has mostly fixed rules for translation.

        Some compilers are simple, while some are complicated. An AI compiler would of course be very complicated. However, it would still have “fixed rules”; it’s just that these rules would be decided by the model itself. If you’re a software dev, you’re also an English-to-xyz-language compiler. You do what your client tells you to do more or less correctly, right? Junior devs do what senior devs tell them to do kinda correctly, right? An AI compiler would be the same thing.

        Programming by using the AI as a “compiler” would likely lead to many bugs that will be hard or impossible to trace without knowing the underlying implementation.

        Bugs would be likely if your AI compiler were dumb. The probability of bugs would drop drastically if your AI compiler were trained more, or on better data.

        It’s ok as an assistant to generate boilerplate code, and warn you about some bugs / issues. Maybe a baseline implementation.

        That is the state of AI today. What you are describing are the capabilities of current AI models. However, I cannot see how this is a criticism of the idea of AI compilers themselves.

        But by the time you’ve exactly described what you want and how you want it, you may as well just write some higher-level code.

        Again. The smarter your model, the more you can abstract your stuff.