Associative Arrays in Bash (AKA Key-Value Dictionaries)

  • Published Nov 22, 2024

COMMENTS • 12

  • @derekfarealz • 1 year ago • +3

    "whoa, whats going on. life is getting hard" 😂

    • @NickJanetakis • 1 year ago • +2

      You're lucky to be alive when using Vim after putting your hands down in the wrong spot.

  • @alex.prodigy • 1 year ago • +1

    Haven't used associative arrays in Bash for a long time. I've found that many people just avoid them, even though they come in handy sometimes. Maybe they avoid them for POSIX compatibility reasons?

    • @NickJanetakis • 1 year ago • +2

      It's possible, but Bash 4+ has been around for almost 15 years. I've never used a system where it wasn't available.
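      For the rare system where this matters, a script can guard itself before relying on the feature. A small defensive sketch (the version check uses Bash's built-in BASH_VERSINFO array; the example key/value is illustrative):

```shell
#!/usr/bin/env bash
# Associative arrays need Bash 4.0+ (macOS still ships Bash 3.2 by
# default), so bail out early on older shells.
if (( BASH_VERSINFO[0] < 4 )); then
  echo "This script needs Bash 4+ (found ${BASH_VERSION})" >&2
  exit 1
fi

# Safe to use associative arrays from here on.
declare -A example=([key]="value")
echo "${example[key]}"
```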

  • @vitormelo22 • 1 year ago • +1

    Very useful.

  • @nnutipa • 1 year ago

    I didn't fully understand the real-world example. Why not just use a plain list there? I mean, if you're looping over all the values it's the same as looping over a list, so you don't use the keys of the "map". Why not replace the for loops with GNU parallel?

    • @NickJanetakis • 1 year ago

      The helm example requires having two pieces of information: the key (i.e. sealed-secrets, eks, argo-cd, etc.) and the value (the URL). A list would only let you store one of those. I don't know about the parallel command, but I went with running the commands sequentially because I wanted to keep the output grouped by command so it's human readable. Those helm commands could have been run in the background with & and finished faster, but then the output would be out of order and harder to read. For this specific case, the human-readable output is the purpose of the command, not raw execution speed. It's something I run about once a week and it finishes in ~5 seconds.
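      The key/value pattern described here can be sketched with a Bash 4+ associative array. One array keeps both pieces of information together, and the loop has access to each pair (the repo names and URLs below are illustrative, not necessarily the exact ones from the video; echoing the helm commands instead of running them keeps the sketch runnable without helm installed):

```shell
#!/usr/bin/env bash
# One associative array holds both the repo name (key) and its URL
# (value) -- a plain list could only store one of the two.
declare -A repos=(
  [sealed-secrets]="https://bitnami-labs.github.io/sealed-secrets"
  [argo-cd]="https://argoproj.github.io/argo-helm"
)

# "${!repos[@]}" expands to the keys; note the iteration order of an
# associative array is unspecified.
for name in "${!repos[@]}"; do
  echo "helm repo add ${name} ${repos[$name]}"
done
```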

    • @nnutipa • 1 year ago

      @NickJanetakis thanks, just a couple of lines from the documentation:
      "If you write loops in shell, you will find GNU parallel may be able to replace most of the loops and make them run faster by running several jobs in parallel."
      "GNU parallel makes sure output from the commands is the same output as you would get had you run the commands sequentially. This makes it possible to use output from GNU parallel as input for other programs."

    • @NickJanetakis • 1 year ago

      @nnutipa Oh nice, I'll definitely check it out. It also sounds like a potential future video idea. Although in this case, how would you escape the loop? You still need to call 6 different helm commands with different values. Are you proposing hard coding those commands with their arguments instead of the array?

    • @nnutipa • 1 year ago • +1

      Hi @NickJanetakis. Sorry for the late reply, I missed your comment somehow. In my opinion, using associative arrays only makes sense if you want to pick out only some specific items from them. If you want to iterate over all the items, it sounds to me like a data processing task. In that case I would use a structured file with parallel, or a simple "while read" loop if you don't want to install new software:

      parallel --colsep '\t' 'helm repo add {1} {2}; helm repo update {1}; helm search repo {1}' :::: structured_file

      while read -r repo_name url; do
        echo "$repo_name : $url"
      done < structured_file

      This allows you to have more than two pieces of information. For example, say you want not only repo_name and url but additionally maintainer_name or something; how would you do that using Bash arrays? "while read" also guarantees execution order. Another odd limitation is that Bash arrays cannot be exported, unlike regular Bash variables.
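      The structured-file approach above can be sketched end to end with the plain while-read variant, which needs no extra software and preserves line order (the file contents are illustrative; a third column such as maintainer_name would just become another variable in the read statement):

```shell
#!/usr/bin/env bash
# Build a throwaway tab-separated file: repo_name <TAB> url per line.
structured_file="$(mktemp)"
printf 'sealed-secrets\thttps://bitnami-labs.github.io/sealed-secrets\n' >> "${structured_file}"
printf 'argo-cd\thttps://argoproj.github.io/argo-helm\n' >> "${structured_file}"

# IFS=$'\t' splits each line on tabs; read -r keeps backslashes
# literal. Lines are processed in file order.
result="$(while IFS=$'\t' read -r repo_name url; do
  echo "${repo_name} : ${url}"
done < "${structured_file}")"

echo "${result}"
rm -f "${structured_file}"
```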

  • @basalduat • 11 months ago

    Stop talking so fast. You are not in a race. Also, use a much larger font size, so we can see your code. 35pt font size is better. Stop talking continuously!!! Speak in short phrases. Let the viewer absorb what you have said before you continue. Merry Xmas!