Thank you for the answer.
The data is relatively small (e.g. 30,000 points). I stripped my code down just to understand where it is slowing down; this snippet summarizes it quite well, since I am mainly transferring point coordinates. Currently the data gets modified, meaning the output array changes, including its length.
```python
import numpy as np

def rpc_test():
    # stand-in for the real function: build and return a large array
    # of point data (5000 rows of 18 values each)
    arr = np.random.rand(5000, 18)
    return arr
```
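On the Rhino side, the call goes through a compas.rpc Proxy, roughly as in the sketch below (the module path 'mymodule' is a placeholder for wherever rpc_test actually lives):

```python
from compas.rpc import Proxy

# connect to the RPC server and point it at the module containing
# rpc_test ('mymodule' is a hypothetical placeholder)
proxy = Proxy('mymodule')

# every call serializes the arguments and the returned array
# and ships them between the two Python processes
points = proxy.rpc_test()
```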
the best workaround would be to use compas_cloud instead of compas.rpc. it allows storing some of the data on the server side and transmitting only what is strictly needed.
for example, if the objects being transmitted have a data structure that doesn’t really change, you could store the structure of the data on the server, and send only the vertex coordinates.
whether this is possible or not depends of course on your specific use case.
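a minimal sketch of what that could look like, based on my reading of the compas_cloud readme (the module path is a placeholder, and treat the exact caching API as an assumption):

```python
from compas_cloud import Proxy

# connect to a running compas_cloud server
proxy = Proxy()

# wrap a server-side function; with cache=True the result stays
# on the server and only a lightweight reference is sent back
process = proxy.function('mypackage.process_polylines', cache=True)

polylines = [[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0]]]

ref = process(polylines)   # returns a reference, not the full result
result = proxy.get(ref)    # fetch the actual data only when needed
```

this way, repeated calls that feed one result into the next never have to pull the intermediate data across the connection.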
The data changes. In short: I send polylines with attributes and get back new polylines with more points, in general a completely new array of polylines. I also transfer a list of integers for the attributes. Roughly, the payload looks like the sketch below.
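(A hypothetical illustration of the shapes involved; the refine function is only a stand-in for the real server-side computation.)

```python
# what goes over the wire: a list of polylines (each a list of XYZ
# points) plus one integer attribute per polyline
polylines_in = [
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],                   # 2 points
    [[0.0, 1.0, 0.0], [1.0, 1.0, 0.0], [2.0, 1.0, 0.0]],  # 3 points
]
attributes = [3, 7]

def refine(polylines, attributes):
    # stand-in for the real computation: returns completely new
    # polylines, here simply with edge midpoints inserted
    out = []
    for pline in polylines:
        new = []
        for a, b in zip(pline[:-1], pline[1:]):
            new.append(a)
            new.append([(p + q) / 2.0 for p, q in zip(a, b)])
        new.append(pline[-1])
        out.append(new)
    return out

# what comes back: new polylines with more points each
polylines_out = refine(polylines_in, attributes)
```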
@Li_Chen perhaps you could point to an example that does this?
It would be really great.
it is also important to remember that packages such as compas.rpc are there for convenience: they allow you to work on research in Rhino without the typical constraints of IronPython, so you can use the same packages as when working in, for example, VSCode.
perhaps for “production” code and applications we could consider making a compiled COMPAS plugin with utility functions that are (potentially) a bit faster at some of the heavy-lifting tasks, like serialising and deserialising large amounts of data…
Is there any documentation for compiling a COMPAS plugin, or is it the standard Rhino3D workflow? I only know a few hints from the McNeel forum. For the same code, the compiled .NET components take around 22 ms instead of 1.4 sec. But even if it is slower, I see a big benefit in faster coding loops. I will also try nanobind (GitHub - wjakob/nanobind: nanobind: tiny and efficient C++/Python bindings) instead of pybind11, but the problem is not in the C++ wrapping. I would really like to remove as much as possible of the overhead of CPython-to-IronPython serialization, which is already a lot to ask, knowing that Rhino lives mainly between C++ and .NET.
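To get a feel for where the time goes, I can time just the JSON round trip of a representative array (using plain json here as an approximation of what the RPC layer does; numbers will obviously vary by machine):

```python
import json
import time

import numpy as np

pts = np.random.rand(5000, 18)

t0 = time.perf_counter()
payload = json.dumps(pts.tolist())   # encode, as the RPC layer would
t1 = time.perf_counter()
restored = json.loads(payload)       # decode on the receiving side
t2 = time.perf_counter()

print(f"encode: {(t1 - t0) * 1000:.0f} ms")
print(f"decode: {(t2 - t1) * 1000:.0f} ms")
print(f"payload size: {len(payload) / 1e6:.1f} MB")
```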
Meanwhile, I checked RhinoCode: it can run numpy, other C++-backed libraries, and also compas in Rhino 8 WIP, but it is still in development and a bit buggy, so I discarded this option. I also checked GHPythonRemote, but it runs only on Python 2.7, so I discarded that option too.
It would be really great if you could tell me more about compas_cloud. In the meantime, I will try to follow the instructions in the readme of GitHub - compas-dev/compas_cloud: COMPAS Remote Procedure Calls using websockets.