Bug with local exec in a module
local-exec works fine in a Terraform script in the root module, but once it is in any module, it throws weird errors.
To reproduce this issue, create the following template:
|main.tf
|/module/main.tf
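A minimal sketch of what the two files might contain (the original contents weren't posted; `null_resource` and the `ls` command are assumptions based on the "list content of ssh dir" remark later in the thread):

```hcl
# main.tf (root module) — hypothetical reproduction
resource "null_resource" "root_exec" {
  provisioner "local-exec" {
    command = "ls ~/.ssh"
  }
}

module "inner" {
  source = "./module"
}

# module/main.tf — the same provisioner, now inside a child module
resource "null_resource" "module_exec" {
  provisioner "local-exec" {
    command = "ls ~/.ssh"
  }
}
```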
With this template, the first local-exec finishes successfully, but the second one fails with:
Category
Bug report
Product
Coder OSS (v2)
Platform
Linux
When using this directory on both my local machine and inside the Docker Coder container, terraform apply finishes with no errors and both local-execs list the contents of the ssh dir.
hey @Andrej, what goal are you trying to achieve by using local-exec?
we have this project to set up Kubernetes using OpenStack and Ansible. What I'm trying to do is run all of this in Coder, running Ansible with some other scripts to set it up. The code I sent above is just to reproduce the issue.
I wanted to move all of the code to a module, so that the Coder template consists only of Coder parameters, the module resource, and Coder metadata
I managed to do this Kubernetes setup when I ran local-exec in the root module, but once I moved it to a module, it stopped working
so if I understand correctly, you're running the ansible CLI and/or other scripts using local-exec within Coder?
just to make sure, you are aware that this will run the command within the Coder host, and not within the workspace right?
yes, im aware
@Phorcys do you think I should use startup_script for this instead?
startup_script will run within the workspace, so if that's what you want, yes
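To illustrate the distinction: a local-exec provisioner runs on the Coder host, while a startup_script runs inside the workspace. A minimal sketch of the latter (resource name and script contents are illustrative):

```hcl
# Sketch: startup_script executes inside the workspace once the agent starts,
# not on the Coder host where Terraform runs.
resource "coder_agent" "main" {
  os   = "linux"
  arch = "amd64"

  startup_script = <<-EOT
    # any command here runs in the workspace environment
    echo "workspace is up"
  EOT
}
```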
the issue is that i'm having trouble understanding what you're really doing
the idea is to run an Ansible playbook within the context of a Coder template to provision a Kubernetes instance?
yeah basically
is it to provision a Kubernetes test cluster per workspace, or to actually manage production clusters?
first one
and why not have terraform do that provisioning instead of the ansible playbook?
because we are creating the kubernetes clusters on the openstack cloud
Terraform has a provider for OpenStack
I know, I am using it to first set up the OpenStack VMs and then Ansible to set up Kubernetes
ah okay i get it now
I think you might want to run and apply the playbook on the same machine
I know it's not ideal, but it's probably the best solution, because using local-exec provisioners isn't really something you want to do
yeah, those were the 2 options 😄
do you think it's possible to do it in the startup script though?
yeah sorry it took me a bit to understand as it's pretty unusual
don't worry, I really appreciate your effort
well all you should have to do is install the ansible CLI in the workspace and run the playbook
yes
so you can either install the Ansible CLI within your base image or do it inside the startup_script
i have ansible-playbook inside the base image now
okay
do you host the playbook in some kind of git repository or something?
that would be the easiest way to get it into the workspace, and then you just have to run it
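The approach suggested above could be sketched like this, assuming ansible-playbook is already in the base image (the repo URL and playbook name are placeholders):

```hcl
# Sketch: clone the playbook repo into the workspace and run it from
# the startup_script. The git URL and site.yml are hypothetical.
resource "coder_agent" "main" {
  os   = "linux"
  arch = "amd64"

  startup_script = <<-EOT
    git clone https://example.com/infra/k8s-playbooks.git ~/playbooks
    ansible-playbook ~/playbooks/site.yml
  EOT
}
```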
yes
but I'm struggling to understand how the startup script works. I used it to run basic commands like ls or sleep, but its creation finished in 0 seconds
yes, that makes sense
basically the startup script runs as soon as the agent gets executed on a machine
so you can't wait for it on the Terraform side of things, otherwise it would just stall forever; Coder is the one doing the waiting after the Terraform stage has finished
and the agent is executed on a machine by the init_script?
yes
basically you would provision a VM, then run the agent's init script in the VM, which would in turn run your startup_script and associated coder_script resources
damn, okay, I didn't really get this from the docs 😄 let me absorb all this, try it, and I'll post the result here. thank you sir
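The chain described above (Terraform creates the VM, the VM's init runs the agent's init_script, and the agent then runs the startup_script) might be wired up roughly like this; the OpenStack arguments and resource names are placeholders:

```hcl
# Sketch of the chain: Terraform provisions the VM and passes the agent's
# init_script as user_data; the agent then runs startup_script in-workspace.
resource "coder_agent" "main" {
  os             = "linux"
  arch           = "amd64"
  startup_script = "echo 'agent is up'"
}

resource "openstack_compute_instance_v2" "workspace" {
  name        = "coder-workspace"
  image_name  = "ubuntu-22.04"   # placeholder image
  flavor_name = "m1.medium"      # placeholder flavor
  user_data   = coder_agent.main.init_script
}
```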
oh, are there any parts of the docs that you found unclear or that you think should be edited to make them more understandable?
now that i understand what is going on i will have a look again and report back to you
thanks a lot!