Compare commits


5 Commits

All five commits are by Romain Vimont:

- `ae87885ffa` hidpiscale (2018-03-05 15:14:09 +01:00)
- `65021416b3` Add developer documentation (2018-03-05 15:10:37 +01:00)
  And update README.
- `305e76ee6e` Revert "Enable high dpi support" (2018-03-05 15:10:37 +01:00)
  Just enabling this flag breaks mouse location values.
  This reverts commit 64efe2c07d.
- `ef0029d10e` Shutdown sockets before closing (2018-03-05 15:10:37 +01:00)
  The server socket does not release the port it was listening for if we just
  close it: we must also shutdown it.
- `36c68c4802` Fix scroll wheel mouse position (2018-03-05 14:59:02 +01:00)
  SDL_MouseWheelEvent does not provide the mouse location, so we used
  SDL_GetMouseState() to retrieve it. Unfortunately, SDL_GetMouseState()
  returns a position expressed in the window coordinate system while the
  position filled in SDL events are expressed in the renderer coordinate
  system. As a consequence, the scroll was not applied at the right position
  on the device. Therefore, convert the coordinate system.
  See <https://stackoverflow.com/questions/49111054/how-to-get-mouse-position-on-mouse-wheel-event>.
9 changed files with 358 additions and 14 deletions

DEVELOP.md (new file)

@@ -0,0 +1,238 @@
# scrcpy for developers
## Overview
This application is composed of two parts:
- the server (`scrcpy-server.jar`), to be executed on the device,
- the client (the `scrcpy` binary), executed on the host computer.
The client is responsible for pushing the server to the device and starting its
execution.
Once the client and the server are connected to each other, the server initially
sends device information (name and initial screen dimensions), then starts to
send a raw H.264 video stream of the device screen. The client decodes the video
frames and displays them as soon as possible, without buffering, to minimize
latency. The client is not aware of the device rotation (which is handled by the
server); it only knows the dimensions of the video frames.
The client captures relevant keyboard and mouse events and transmits them to the
server, which injects them into the device.
## Server
### Privileges
Capturing the screen requires some privileges, which are granted to `shell`.
The server is a Java application (with a [`public static void main(String...
args)`][main] method), compiled against the Android framework, and executed as
`shell` on the Android device.
[main]: server/src/main/java/com/genymobile/scrcpy/Server.java#L61
To run such a Java application, the classes must be [_dexed_][dex] (typically,
to `classes.dex`). If `my.package.MainClass` is the main class, compiled to
`classes.dex`, pushed to the device in `/data/local/tmp`, then it can be run
with:

    adb shell CLASSPATH=/data/local/tmp/classes.dex \
        app_process / my.package.MainClass

_The path `/data/local/tmp` is a good candidate to push the server, since it's
readable and writable by `shell`, but not world-writable, so a malicious
application may not replace the server just before the client executes it._
Instead of a raw _dex_ file, `app_process` accepts a _jar_ containing
`classes.dex` (e.g. an [APK]). For simplicity, and to benefit from the gradle
build system, the server is built to an (unsigned) APK (renamed to
`scrcpy-server.jar`).
[dex]: https://en.wikipedia.org/wiki/Dalvik_(software)
[apk]: https://en.wikipedia.org/wiki/Android_application_package
### Hidden methods
Although the server is compiled against the Android framework, [hidden] methods
and classes are not directly accessible (and they may differ from one Android
version to another).
They can nonetheless be called using reflection. The communication with hidden
components is provided by [_wrapper_ classes][wrappers] and [aidl].
[hidden]: https://stackoverflow.com/a/31908373/1987178
[wrappers]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/wrappers
[aidl]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/aidl/android/view
### Threading
The server uses 2 threads:
- the **main** thread, encoding and streaming the video to the client;
- the **controller** thread, listening for _control events_ (typically,
keyboard and mouse events) from the client.
Since video encoding is typically performed in hardware, there would be no
benefit in encoding and streaming in two different threads.
### Screen video encoding
The encoding is managed by [`ScreenEncoder`].
The video is encoded using the [`MediaCodec`] API. The codec takes its input
from a [surface] associated with the display, and writes the resulting H.264
stream to the provided output stream (the socket connected to the client).
[`ScreenEncoder`]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java
[`MediaCodec`]: https://developer.android.com/reference/android/media/MediaCodec.html
[surface]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L63-L64
On device [rotation], the codec, surface and display are reinitialized, and a
new video stream is produced.
New frames are produced only when changes occur on the surface. This is
desirable because it avoids sending unnecessary frames, but it has drawbacks:
- no frame is sent on start if the device screen does not change,
- after fast motion changes, the last frame may have poor quality.
Both problems are [solved][repeat] by the flag
[`KEY_REPEAT_PREVIOUS_FRAME_AFTER`][repeat-flag].
[rotation]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L89-L92
[repeat]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/ScreenEncoder.java#L125-L126
[repeat-flag]: https://developer.android.com/reference/android/media/MediaFormat.html#KEY_REPEAT_PREVIOUS_FRAME_AFTER
### Input events injection
_Control events_ are received from the client by the [`EventController`]
(running in a separate thread). There are 5 types of input events:
- keycode (cf [`KeyEvent`]),
- text (special characters may not be handled by keycodes directly),
- mouse motion/click,
- mouse scroll,
- custom command (e.g. to switch the screen on).
All of them may need to inject input events into the system. To do so, they use
the _hidden_ method [`InputManager.injectInputEvent`] (exposed by our
[`InputManager` wrapper][inject-wrapper]).
[`EventController`]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/EventController.java#L70
[`KeyEvent`]: https://developer.android.com/reference/android/view/KeyEvent.html
[`MotionEvent`]: https://developer.android.com/reference/android/view/MotionEvent.html
[`InputManager.injectInputEvent`]: https://android.googlesource.com/platform/frameworks/base/+/oreo-release/core/java/android/hardware/input/InputManager.java#857
[inject-wrapper]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/server/src/main/java/com/genymobile/scrcpy/wrappers/InputManager.java#L27
## Client
The client relies on [SDL], which provides a cross-platform API for UI, input
events, threading, etc.
The video stream is decoded by [libav] (FFmpeg).
[SDL]: https://www.libsdl.org
[libav]: https://www.libav.org/
### Initialization
On startup, in addition to _libav_ and _SDL_ initialization, the client must
push and start the server on the device, and open a socket so that they may
communicate.
Note that the client-server roles are expressed at the application level:
- the server _serves_ the video stream and handles requests from the client,
- the client _controls_ the device through the server.
However, the roles are reversed at the network level:
- the client opens a server socket and listens on a port before starting the
server,
- the server connects to the client.
This role inversion guarantees that the connection will not fail due to race
conditions, and avoids polling.
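The listen-before-start order can be sketched with plain BSD sockets (a minimal
sketch with hypothetical helper names, not the actual scrcpy code):

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdint.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

// Open a listening socket on loopback; the client does this *before* starting
// the server on the device, so the server's connect() cannot happen while
// nobody is listening yet.
static int listen_on_loopback(uint16_t *port) {
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    if (sock == -1) {
        return -1;
    }
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = 0; // 0: let the kernel pick a free port
    if (bind(sock, (struct sockaddr *) &addr, sizeof(addr)) == -1
            || listen(sock, 1) == -1) {
        close(sock);
        return -1;
    }
    socklen_t len = sizeof(addr);
    getsockname(sock, (struct sockaddr *) &addr, &len);
    *port = ntohs(addr.sin_port);
    return sock; // only now start the remote process, which connects back
}

// self-check helper: a listening socket with a valid port can be obtained
static int can_listen(void) {
    uint16_t port = 0;
    int sock = listen_on_loopback(&port);
    int ok = sock != -1 && port != 0;
    if (sock != -1) {
        close(sock);
    }
    return ok;
}
```

Once `listen()` has returned, the kernel queues incoming connections, so the
server may connect at any point before the client reaches `accept()`.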
Once the server is connected, it sends the device information (name and initial
screen dimensions). Thus, the client may initialize the window and renderer
before the first frame is available.
To minimize startup time, SDL initialization is performed while listening for
the connection from the server (see commit [90a46b4]).
[90a46b4]: https://github.com/Genymobile/scrcpy/commit/90a46b4c45637d083e877020d85ade52a9a5fa8e
### Threading
The client uses 3 threads:
- the **main** thread, executing the SDL event loop,
- the **decoder** thread, decoding video frames,
- the **controller** thread, sending _control events_ to the server.
### Decoder
The [decoder] runs in a separate thread. It uses _libav_ to decode the H.264
stream from the socket, and notifies the main thread when a new frame is
available.
There are two [frames] simultaneously in memory:
- the **decoding** frame, written by the decoder from the decoder thread,
- the **rendering** frame, rendered in a texture from the main thread.
When a new decoded frame is available, the decoder _swaps_ the decoding and
rendering frames (with proper synchronization). Thus, it immediately starts
decoding a new frame while the main thread renders the last one.
[decoder]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/decoder.c
[frames]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/frames.h
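The swap can be sketched as follows (illustrative names and types, not the
actual scrcpy structures, which hold libav frames):

```c
#include <pthread.h>

// Two frames: the decoder writes into decoding_frame while the main thread
// reads rendering_frame; a pointer swap under the mutex hands over a finished
// frame without copying any pixel data.
struct frames {
    void *decoding_frame;  // written by the decoder thread
    void *rendering_frame; // read by the main (rendering) thread
    pthread_mutex_t mutex;
};

// Called by the decoder when a frame is complete; returns the buffer it
// should decode into next (the former rendering frame).
static void *frames_swap(struct frames *frames) {
    pthread_mutex_lock(&frames->mutex);
    void *tmp = frames->decoding_frame;
    frames->decoding_frame = frames->rendering_frame;
    frames->rendering_frame = tmp;
    void *next = frames->decoding_frame;
    pthread_mutex_unlock(&frames->mutex);
    return next;
}

// two stand-in buffers for demonstration
static char buf_a, buf_b;
static struct frames frames = { &buf_a, &buf_b, PTHREAD_MUTEX_INITIALIZER };
```

The mutex only guards the pointer exchange, so the critical section is tiny and
neither thread ever blocks on the other for long.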
### Controller
The [controller] is responsible for sending _control events_ to the device. It
runs in a separate thread, to avoid I/O on the main thread.
For each SDL event received on the main thread, the [input manager][inputmanager]
creates the appropriate [_control events_][controlevent]. It is responsible for
converting SDL events to Android events (using [convert]). It pushes the
_control events_ to a blocking queue held by the controller. In its own thread,
the controller takes events from the queue, serializes them, and sends them to
the server.
[controller]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/controller.h
[controlevent]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/controlevent.h
[inputmanager]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/inputmanager.h
[convert]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/convert.h
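The hand-off between the main thread and the controller thread can be sketched
as a POSIX blocking queue (illustrative names and a fixed capacity, not the
actual scrcpy structures):

```c
#include <pthread.h>
#include <stdbool.h>

#define QUEUE_CAPACITY 64

struct event_queue {
    int events[QUEUE_CAPACITY]; // stand-in for control events
    int head, count;
    pthread_mutex_t mutex;
    pthread_cond_t not_empty;
};

// Called from the main thread: never blocks, so the UI stays responsive.
static bool queue_push(struct event_queue *q, int event) {
    pthread_mutex_lock(&q->mutex);
    if (q->count == QUEUE_CAPACITY) {
        pthread_mutex_unlock(&q->mutex);
        return false; // report failure rather than block the event loop
    }
    q->events[(q->head + q->count) % QUEUE_CAPACITY] = event;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->mutex);
    return true;
}

// Called from the controller thread: blocks until an event is available.
static int queue_take(struct event_queue *q) {
    pthread_mutex_lock(&q->mutex);
    while (q->count == 0) {
        pthread_cond_wait(&q->not_empty, &q->mutex);
    }
    int event = q->events[q->head];
    q->head = (q->head + 1) % QUEUE_CAPACITY;
    q->count--;
    pthread_mutex_unlock(&q->mutex);
    return event;
}

static struct event_queue queue = {
    .mutex = PTHREAD_MUTEX_INITIALIZER,
    .not_empty = PTHREAD_COND_INITIALIZER,
};
```

The asymmetry is deliberate: only the controller thread may block (waiting for
events to send), while pushes from the SDL event loop always return at once.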
### UI and event loop
Initialization, input events and rendering are all [managed][scrcpy] in the main
thread.
Events are handled in the [event loop], which either updates the [screen] or
delegates to the [input manager][inputmanager].
[scrcpy]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/scrcpy.c
[event loop]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/scrcpy.c#L38
[screen]: https://github.com/Genymobile/scrcpy/blob/079d750d41b7849eb1b9caaa6151ef2429581584/app/src/screen.h
## Hack
For more details, go read the code!
If you find a bug, or have an awesome idea to implement, please discuss and
contribute ;-)


@@ -1,8 +1,8 @@
# scrcpy
This project displays screens of Android devices connected on USB, and allow to
control them using the host mouse and keyboard. It does not require any _root_
access. It works on _GNU/Linux_, _Windows_ and _Mac OS_.
This application provides display and control of Android devices connected on
USB. It does not require any _root_ access. It works on _GNU/Linux_, _Windows_
and _Mac OS_.
![screenshot](assets/screenshot-debian-600.jpg)
@@ -185,6 +185,23 @@ If several devices are listed in `adb devices`, you must specify the _serial_:
| enable/disable FPS counter (on stdout) | `Ctrl`+`i` |
## Why _scrcpy_?
A colleague challenged me to find a name as unpronounceable as [gnirehtet].
[`strcpy`] copies a **str**ing; `scrcpy` copies a **scr**een.
[gnirehtet]: https://github.com/Genymobile/gnirehtet
[`strcpy`]: http://man7.org/linux/man-pages/man3/strcpy.3.html
## Developers
Read the [developers page].
[developers page]: DEVELOP.md
## Licence
Copyright (C) 2018 Genymobile


@@ -8,6 +8,7 @@ src = [
'src/device.c',
'src/fpscounter.c',
'src/frames.c',
'src/hidpi.c',
'src/inputmanager.c',
'src/lockutil.c',
'src/net.c',


@@ -136,6 +136,7 @@ SDL_bool input_key_from_sdl_to_android(const SDL_KeyboardEvent *from,
SDL_bool mouse_button_from_sdl_to_android(const SDL_MouseButtonEvent *from,
struct size screen_size,
struct hidpi_scale *hidpi_scale,
struct control_event *to) {
to->type = CONTROL_EVENT_TYPE_MOUSE;
@@ -145,21 +146,30 @@ SDL_bool mouse_button_from_sdl_to_android(const SDL_MouseButtonEvent *from,
to->mouse_event.buttons = convert_mouse_buttons(SDL_BUTTON(from->button));
to->mouse_event.position.screen_size = screen_size;
to->mouse_event.position.point.x = (Uint16) from->x;
to->mouse_event.position.point.y = (Uint16) from->y;
Sint32 x = from->x;
Sint32 y = from->y;
hidpi_unscale_coordinates(hidpi_scale, &x, &y);
to->mouse_event.position.point.x = (Uint16) x;
to->mouse_event.position.point.y = (Uint16) y;
return SDL_TRUE;
}
SDL_bool mouse_motion_from_sdl_to_android(const SDL_MouseMotionEvent *from,
struct size screen_size,
struct hidpi_scale *hidpi_scale,
struct control_event *to) {
to->type = CONTROL_EVENT_TYPE_MOUSE;
to->mouse_event.action = AMOTION_EVENT_ACTION_MOVE;
to->mouse_event.buttons = convert_mouse_buttons(from->state);
to->mouse_event.position.screen_size = screen_size;
to->mouse_event.position.point.x = from->x;
to->mouse_event.position.point.y = from->y;
Sint32 x = from->x;
Sint32 y = from->y;
hidpi_unscale_coordinates(hidpi_scale, &x, &y);
to->mouse_event.position.point.x = (Uint16) x;
to->mouse_event.position.point.y = (Uint16) y;
return SDL_TRUE;
}


@@ -3,7 +3,10 @@
#include <SDL2/SDL_stdinc.h>
#include <SDL2/SDL_events.h>
#include "common.h"
#include "controlevent.h"
#include "hidpi.h"
struct complete_mouse_motion_event {
SDL_MouseMotionEvent *mouse_motion_event;
@@ -19,12 +22,14 @@ SDL_bool input_key_from_sdl_to_android(const SDL_KeyboardEvent *from,
struct control_event *to);
SDL_bool mouse_button_from_sdl_to_android(const SDL_MouseButtonEvent *from,
struct size screen_size,
struct hidpi_scale *hidpi_scale,
struct control_event *to);
// the video size may be different from the real device size, so we need the
// size to which the absolute position applies, to scale it accordingly
SDL_bool mouse_motion_from_sdl_to_android(const SDL_MouseMotionEvent *from,
struct size screen_size,
struct hidpi_scale *hidpi_scale,
struct control_event *to);
// on Android, a scroll event requires the current mouse position

app/src/hidpi.c (new file)

@@ -0,0 +1,16 @@
#include "hidpi.h"
void hidpi_get_scale(struct screen *screen, struct hidpi_scale *scale) {
SDL_GL_GetDrawableSize(screen->window, &scale->horizontal.num, &scale->vertical.num);
SDL_GetWindowSize(screen->window, &scale->horizontal.div, &scale->vertical.div);
}
void hidpi_unscale_coordinates(struct hidpi_scale *scale, Sint32 *x, Sint32 *y) {
// to unscale, we divide by the ratio (so num and div are reversed)
if (scale->horizontal.num) {
*x = ((Sint64) *x) * scale->horizontal.div / scale->horizontal.num;
}
if (scale->vertical.num) {
*y = ((Sint64) *y) * scale->vertical.div / scale->vertical.num;
}
}

app/src/hidpi.h (new file)

@@ -0,0 +1,24 @@
#ifndef HIDPI_H
#define HIDPI_H
#include "common.h"
#include "screen.h"
// rational number p/q
struct rational {
int num;
int div;
};
struct hidpi_scale {
struct rational horizontal; // drawable.width / window.width
struct rational vertical; // drawable.height / window.height
};
void hidpi_get_scale(struct screen *screen, struct hidpi_scale *hidpi_scale);
// mouse locations need to be "unscaled" if hidpi is enabled
// <https://nlguillemot.wordpress.com/2016/12/11/high-dpi-rendering/>
void hidpi_unscale_coordinates(struct hidpi_scale *hidpi_scale, Sint32 *x, Sint32 *y);
#endif
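As a worked example of the rational unscaling above: with a 2x HiDPI scale (a
1920x1080 window backed by a 3840x2160 drawable), a coordinate of 960 unscales
to 480. A stand-alone restatement of the arithmetic, with SDL's `Sint32`/
`Sint64` replaced by stdint types so it compiles without SDL:

```c
#include <stdint.h>

struct rational { int num; int div; };
struct hidpi_scale { struct rational horizontal, vertical; };

// the scale is drawable/window, so unscaling multiplies by div/num
static void hidpi_unscale(const struct hidpi_scale *scale,
                          int32_t *x, int32_t *y) {
    if (scale->horizontal.num) {
        *x = (int32_t) ((int64_t) *x * scale->horizontal.div
                                     / scale->horizontal.num);
    }
    if (scale->vertical.num) {
        *y = (int32_t) ((int64_t) *y * scale->vertical.div
                                     / scale->vertical.num);
    }
}

// 2x HiDPI: a 1920x1080 window backed by a 3840x2160 drawable
static const struct hidpi_scale retina = { {3840, 1920}, {2160, 1080} };

// helper: unscale a single horizontal coordinate under the retina scale
static int32_t unscaled_x(int32_t x) {
    int32_t y = 0;
    hidpi_unscale(&retina, &x, &y);
    return x;
}
```

The intermediate 64-bit multiplication avoids overflow before the division.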


@@ -1,13 +1,37 @@
#include "inputmanager.h"
#include "convert.h"
#include "hidpi.h"
#include "lockutil.h"
#include "log.h"
static struct point get_mouse_point(void) {
int x;
int y;
SDL_GetMouseState(&x, &y);
// Convert window coordinates (as provided by SDL_GetMouseState()) to renderer
// coordinates (as provided in SDL events)
//
// See my question:
// <https://stackoverflow.com/questions/49111054/how-to-get-mouse-position-on-mouse-wheel-event>
static void convert_to_renderer_coordinates(SDL_Renderer *renderer, int *x, int *y) {
SDL_Rect viewport;
float scale_x, scale_y;
SDL_RenderGetViewport(renderer, &viewport);
SDL_RenderGetScale(renderer, &scale_x, &scale_y);
*x = (int) (*x / scale_x) - viewport.x;
*y = (int) (*y / scale_y) - viewport.y;
}
static struct point get_mouse_point(struct screen *screen) {
int mx;
int my;
SDL_GetMouseState(&mx, &my);
convert_to_renderer_coordinates(screen->renderer, &mx, &my);
struct hidpi_scale hidpi_scale;
hidpi_get_scale(screen, &hidpi_scale);
// SDL sometimes uses "int", sometimes "Sint32"
Sint32 x = mx;
Sint32 y = my;
hidpi_unscale_coordinates(&hidpi_scale, &x, &y);
SDL_assert_release(x >= 0 && x < 0x10000 && y >= 0 && y < 0x10000);
return (struct point) {
.x = (Uint16) x,
@@ -178,8 +202,12 @@ void input_manager_process_mouse_motion(struct input_manager *input_manager,
// do not send motion events when no button is pressed
return;
}
struct hidpi_scale hidpi_scale;
hidpi_get_scale(input_manager->screen, &hidpi_scale);
struct control_event control_event;
if (mouse_motion_from_sdl_to_android(event, input_manager->screen->frame_size, &control_event)) {
if (mouse_motion_from_sdl_to_android(event, input_manager->screen->frame_size, &hidpi_scale, &control_event)) {
if (!controller_push_event(input_manager->controller, &control_event)) {
LOGW("Cannot send mouse motion event");
}
@@ -192,8 +220,12 @@ void input_manager_process_mouse_button(struct input_manager *input_manager,
turn_screen_on(input_manager->controller);
return;
};
struct hidpi_scale hidpi_scale;
hidpi_get_scale(input_manager->screen, &hidpi_scale);
struct control_event control_event;
if (mouse_button_from_sdl_to_android(event, input_manager->screen->frame_size, &control_event)) {
if (mouse_button_from_sdl_to_android(event, input_manager->screen->frame_size, &hidpi_scale, &control_event)) {
if (!controller_push_event(input_manager->controller, &control_event)) {
LOGW("Cannot send mouse button event");
}
@@ -204,7 +236,7 @@ void input_manager_process_mouse_wheel(struct input_manager *input_manager,
const SDL_MouseWheelEvent *event) {
struct position position = {
.screen_size = input_manager->screen->frame_size,
.point = get_mouse_point(),
.point = get_mouse_point(input_manager->screen),
};
struct control_event control_event;
if (mouse_wheel_from_sdl_to_android(event, position, &control_event)) {


@@ -71,6 +71,7 @@ static socket_t listen_on_port(Uint16 port) {
static void close_socket(socket_t *socket) {
SDL_assert(*socket != INVALID_SOCKET);
net_shutdown(*socket, SHUT_RDWR);
if (!net_close(*socket)) {
LOGW("Cannot close socket");
return;