Broadcast camera render over network

We have built an app with a 3D model on the left and a user interface on the right. Changes the user makes in the interface are reflected on the 3D model, e.g. if they select pants, a pair of pants gets instantiated on the model.
Now I would like to display the 3D model in one window and the interface in a separate window.
I understand this has to be done via networking. Which approach should I go for: Network.Instantiate (which would be complex, since the model has many dependencies) or sending the rendered camera texture?

I would suggest using Network.Instantiate, however complex it gets. Don't even consider sending the rendered camera texture: streaming a render texture every frame is practically impossible over a network.

E.g. if your game runs at 30 FPS and each rendered frame is 1 MB, you would have to send 30 MB per second, which is impractical. And don't forget the other end has to receive those 30 MB as well, so the two machines together are handling at least 60 MB of traffic every second.
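For a concrete sense of scale, here is a back-of-the-envelope check. The 1024x1024 RGBA32 frame size and 30 FPS are assumptions for illustration, not taken from your project:

```csharp
using UnityEngine;

// Rough bandwidth estimate for streaming raw frames (assumed sizes).
public class BandwidthEstimate : MonoBehaviour
{
    void Start()
    {
        const int width = 1024, height = 1024;
        const int bytesPerPixel = 4;   // RGBA32, uncompressed
        const int fps = 30;

        long bytesPerSecond = (long)width * height * bytesPerPixel * fps;

        // ~120 MB/s for a 1024x1024 RGBA frame at 30 FPS -- far beyond
        // what you can push without real video compression, which is
        // why replicating state beats streaming pixels here.
        Debug.Log(bytesPerSecond / (1024f * 1024f) + " MB/s uncompressed");
    }
}
```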

Instead, use Network.Instantiate() and make sure everything is instantiated by the server: if the action originates on a client, send a request over to the server and let the server do the instantiating (not the only way, just a suggestion; see the sketch below).
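A minimal sketch of that server-authoritative pattern using Unity's legacy networking API. The class, field, and method names (GarmentSpawner, pantsPrefab, SpawnPants) are placeholders, and the script is assumed to sit on a GameObject that has a NetworkView component:

```csharp
using UnityEngine;

public class GarmentSpawner : MonoBehaviour
{
    public GameObject pantsPrefab; // assign the pants prefab in the Inspector

    // Called by the UI when the user picks pants.
    public void RequestPants()
    {
        if (Network.isServer)
            SpawnPants(); // the server can spawn directly
        else
            // Clients only ask; the server decides and instantiates.
            GetComponent<NetworkView>().RPC("SpawnPants", RPCMode.Server);
    }

    [RPC]
    void SpawnPants()
    {
        // Network.Instantiate is buffered by the server and replicated
        // to every connected peer, so the model window stays in sync.
        Network.Instantiate(pantsPrefab, Vector3.zero, Quaternion.identity, 0);
    }
}
```

Only a small RPC and the instantiate call cross the wire, instead of megabytes of pixels per frame.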

The first rule of networking is to keep traffic to a minimum.