Parallel processing: collecting and processing remote server logs via HTTP / scp
This question is closed. Reopen it to edit or answer it.
I would like to stream a file over an HTTP endpoint / scp so that I can collect the logs from a remote server.
Is there a way that parallel.internal.logging.enableClientLogging, or Parallel Computing Toolbox in general, can help me connect the input and output sides of this? I have a model where I use uigetdir to collect logs and then process them.
But now I have a larger system where the logs need to be downloaded and then processed.
I accept a HTTP POST request which will contain the following JSON in its body:
{
event,
entity,
actor,
machines,
tests,
options
}
I'll call back the HTTP domain name mentioned in the actor field, so that it gives back the JSON of all the log files generated for that job.
Can you please suggest how JobStorageLocation can be used to do this?
Thank you!
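For context, a hedged sketch of the callback step described above, assuming the actor field holds a plain URL string and the endpoint returns JSON (webread decodes JSON responses automatically; 'request.json' and the field names are illustrative, not from any confirmed API):

```matlab
% Illustrative sketch only: parse the incoming POST body, then call
% back the URL in the "actor" field to fetch the log-file index.
payload  = jsondecode(fileread('request.json'));  % the JSON body shown above
actorUrl = payload.actor;                         % assumed to be a URL string
logIndex = webread(actorUrl);                     % JSON list of log files for the job
```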
6 comments
Edric Ellis
on 22 Jun 2021
I'm not sure I've understood quite what's going on here. Are you asking for a way to have the workers write out files in JobStorageLocation, and then get the client to pick them up afterwards? If so, Parallel Computing Toolbox doesn't have any APIs to help with that. If the client and workers can see any sort of shared filesystem, your best bet is to put the files there.
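If a shared filesystem is available, a minimal sketch of that approach might look like this (sharedDir, numTasks, and processOneTask are illustrative placeholders, not part of the original thread):

```matlab
% Workers write their log files into a directory that both the
% client and the workers can see; the client picks them up afterwards.
sharedDir = '/shared/logs';   % placeholder path on the shared filesystem
numTasks  = 4;                % placeholder task count
parfor k = 1:numTasks
    logFile = fullfile(sharedDir, sprintf('task%d.log', k));
    fid = fopen(logFile, 'w');
    fprintf(fid, '%s\n', processOneTask(k));  % processOneTask is hypothetical
    fclose(fid);
end
% Back on the client, list the files written by the workers:
logFiles = dir(fullfile(sharedDir, '*.log'));
```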
Life is Wonderful
on 23 Jun 2021
Edited: Life is Wonderful
on 28 Jun 2021
Edric Ellis
on 23 Jun 2021
I'm afraid I still don't understand where (i.e. on which host, caused by which process) the files are being generated. There is no portable way to store arbitrary files inside JobStorageLocation; the only documented API is to use properties of the Job and Task objects. If the data needs to be transferred from client to worker, you could read the files and put the resulting data structure into some property of the Job or Task; if the data needs to be transferred from worker to client, you could read the file(s) and return the value from your task's function. So, you could do something like this:
% Example sending JSON data to a worker for processing
jsonFile = 'somePathAtClient.json';
jsonData = jsondecode(fileread(jsonFile));
job = batch(@processJsonData, 1, {jsonData});

function out = processJsonData(jsonData)
    out = doStuff(jsonData);
end
Or, in the other direction:
job = batch(@returnJsonData, 1, {args});

function out = returnJsonData(args)
    % do stuff ...
    jsonFile = 'somePathOnTheWorker.json';
    jsonData = jsondecode(fileread(jsonFile));
    out = doStuff(jsonData);
end
I don't know if that helps.
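For completeness, once either batch job above has run, the client retrieves the returned value using documented Parallel Computing Toolbox calls (a minimal sketch, continuing from the job variable above):

```matlab
wait(job);                    % block until the batch job finishes
results = fetchOutputs(job);  % cell array of the task function's outputs
jsonData = results{1};        % the value returned by the worker
delete(job);                  % remove the job's files from JobStorageLocation
```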
Life is Wonderful
on 24 Jun 2021
Edited: Life is Wonderful
on 24 Jun 2021
Edric Ellis
on 24 Jun 2021
Sorry, I don't know anything about MATLAB Production Server or matlab.net.http. You might be better off asking a separate question about that piece.
Life is Wonderful
on 24 Jun 2021
Answers (0)