When you start the execution on a remote machine, it is literally that – a different machine, which has no knowledge of or access to the files on your local one – it has its own filesystem, and your script tells it “set the filename to the local hard disk, the user directory concatenated with this subpath”.
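For illustration, here is a minimal Java sketch of the kind of path construction that causes the problem – the "testdata"/"input.csv" names are placeholders, not taken from the original question:

    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class LocalPathExample {
        public static void main(String[] args) {
            // "user.dir" is the working directory of the JVM running this code,
            // i.e. your local machine – the remote node never sees this path.
            Path dataFile = Paths.get(System.getProperty("user.dir"), "testdata", "input.csv");
            System.out.println(dataFile.toAbsolutePath());
        }
    }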

I see you’re using a hub, so – presumably – this is an on-prem environment over which you have control.

So here are three solutions, from the least preferable to the most.

Option 1 – local copies

I see you’re using the user.dir system property – copy the files to the directory it points to, and they will resolve. On every hub node. You are in for a lot of copying. When the system user that starts the hub node changes – you are in for a lot of copying again. When a file is changed, or a new one is added to the dataset – well, you get my point.

Pros – well, it’s a solution.
Cons – pretty much everything.

Option 2 – shared location

Presuming you have access to the hub machines, create a network share, mount it on all of them with the same URI – for example \\auto-shares – and use it as the base path in the script.
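A minimal sketch of what “use it as the base” could look like, assuming the share is mounted as \\auto-shares on every node and the dataset sits under a testdata folder (both names are illustrative):

    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class SharedBasePath {
        // Same UNC path on every hub node, so it resolves no matter
        // which machine the session lands on.
        private static final String SHARED_BASE = "\\\\auto-shares\\testdata";

        public static Path resolveDataFile(String relativePath) {
            return Paths.get(SHARED_BASE, relativePath);
        }
    }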

Pros –

  • one-time setup (don’t forget to do it for new nodes);
  • presumably fast transfer speed;
  • no extra changes for the test harness system (see option 3).

Cons –

  • access required;
  • doesn’t scale optimally;
  • can’t be used for cloud hubs (see con #1).

Option 3 – download the needed files

This one might seem tricky at first, but in general it is the most robust. In summary: before starting the tests, download the needed files from a known internet location to user.dir – thus the tests will always find them. This is best done at the run initialization/setup level – if it fails, there is no need to run the tests.
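A minimal sketch of such a setup step, assuming Java and a placeholder download URL – call it from whatever run-initialization hook your framework has (e.g. a suite-level setup method), and let a failure abort the run:

    import java.io.InputStream;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class TestDataDownloader {

        // The host and file names below are placeholders.
        public static void downloadTestData() throws Exception {
            Path targetDir = Paths.get(System.getProperty("user.dir"));
            for (String fileName : new String[] {"input.csv", "upload.pdf"}) {
                URL source = new URL("https://test-data.example.com/" + fileName);
                try (InputStream in = source.openStream()) {
                    // Overwrite any stale copy so the run always uses current data.
                    Files.copy(in, targetDir.resolve(fileName), StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }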

The location can be anything you can get your hands on (and control) – a test HTTP server accessible from outside, the corp site :), or even a file-sharing service (generally, not a good idea…)

Pros –

  • applicable to practically everything;
  • can be used for cloud hubs (BrowserStack for sure allows saving files locally during a run);
  • set-and-forget (in theory 🙂 – there’s always maintenance involved).

Cons –

  • requires well-thought-out changes to the test harness/framework;
  • if you’ll be downloading the files through the browser (and you should, so this works on cloud hubs), the browsers have to be started with profiles that allow automatic file download and overwriting – see the sketch after this list;
  • security – and I cannot stress this enough – the files must not contain any secret/private/proprietary information, as they are publicly accessible by definition.
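A minimal sketch of such a browser profile for a remote Chrome session – the hub URL and download directory are placeholders, and the preference names are standard Chrome preferences:

    import java.net.URL;
    import java.util.HashMap;
    import java.util.Map;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeOptions;
    import org.openqa.selenium.remote.RemoteWebDriver;

    public class AutoDownloadProfile {
        public static WebDriver startRemoteChrome() throws Exception {
            Map<String, Object> prefs = new HashMap<>();
            prefs.put("download.default_directory", "C:\\temp\\downloads"); // placeholder path
            prefs.put("download.prompt_for_download", false);               // no "Save as" dialog
            prefs.put("safebrowsing.enabled", true);                        // avoid "keep/discard" prompts

            ChromeOptions options = new ChromeOptions();
            options.setExperimentalOption("prefs", prefs);

            return new RemoteWebDriver(new URL("http://your-hub:4444/wd/hub"), options);
        }
    }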



Source link https://sqa.stackexchange.com/questions/25813/-to---in----in-remote-machine-using--
