Having an issue where Windows clients are automatically updating to new versions but Mac clients aren't. Our slaves are all Windows, but our clients are a mix (mostly used to submit jobs via submission scripts, though some go through the Monitor). We are running the betas, if that changes anything. The only solution so far has been a manual install on the client machines. Is this a known issue, or just the cost of a mixed environment?
Joel
MBA Productions
The update process is triggered when an application is run via the Launcher. On Windows we’re sneaky and all the applications start via Launcher by default (check the start menu shortcuts). On the Mac, they’re run directly, so the version difference check isn’t kicking in.
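This isn't Deadline's actual code, just a rough sketch of the difference, assuming a hypothetical repository_version() lookup and upgrade_client() hook: the Launcher path compares the installed client version against the Repository before starting the application, while a direct launch (the current Mac behaviour) never performs that comparison.

```python
# Illustrative sketch only -- not Deadline's actual implementation.
# repository_version() and upgrade_client() are hypothetical stand-ins
# to show why launching directly skips the version-difference check.
import subprocess

INSTALLED_VERSION = "6.0"            # assumed local client version

def repository_version() -> str:     # hypothetical: would query the Repository
    return "6.1"

def upgrade_client() -> None:        # hypothetical: would run the auto-upgrade
    print("Versions differ; pulling the new client build before launching...")

def launch_via_launcher(app_path: str) -> None:
    # Launcher path (Windows Start Menu shortcuts): check versions first,
    # then start the requested application.
    if INSTALLED_VERSION != repository_version():
        upgrade_client()
    subprocess.Popen([app_path])

def launch_directly(app_path: str) -> None:
    # Mac path today: the .app starts the binary directly,
    # so no version comparison ever happens.
    subprocess.Popen([app_path])
```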
I’ll talk with the Devs about this one.
So, we're not actually sure why, but when we try to run an application from the Launcher on the Mac, it crashes. My guess is it's a security feature, but I haven't done enough Mac development to know.
“The applications just crash. It seems like the .app packages don’t like to launch other applications (even though it’s just a shell script under the hood).”
Also worth noting: OS X clients will need to be upgraded to 6.1 manually using the installers, because there were some permissions problems in the folder structure during the 6.0 install.
Upgrades from 6.1 to future 6.x versions should not have this problem.
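If you want to sanity-check those permissions before doing the manual 6.1 install, something like the snippet below can flag anything the current user can't write to. The /Applications/Thinkbox/Deadline path is just an assumption; point it at your actual client folder.

```python
# Quick illustrative check (not an official install step): walk the existing
# client folder and report paths the current user cannot write to.
import os

INSTALL_ROOT = "/Applications/Thinkbox/Deadline"  # assumed location; adjust as needed

def report_unwritable(root: str) -> None:
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            if not os.access(path, os.W_OK):
                print("Not writable by current user:", path)

if __name__ == "__main__":
    report_unwritable(INSTALL_ROOT)
```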