Compare commits


35 Commits

Author SHA1 Message Date
Romain Vimont
e93fd94842 demuxer-stream-id 2023-02-09 21:59:56 +01:00
Romain Vimont
e131b9c667 audio-streamer 2023-02-09 21:53:46 +01:00
Romain Vimont
e1a009de5d AudioEncoder WIP 2023-02-09 21:53:46 +01:00
Romain Vimont
7d4aedc29a audio demuxer 2023-02-09 21:53:46 +01:00
Romain Vimont
fa1178195e Rename demuxer to video_demuxer
There will be another demuxer instance for audio.
2023-02-07 23:12:54 +01:00
Simon Chan
5dcce5f241 Capture device audio WIP
Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-07 23:12:54 +01:00
Simon Chan
3f58dd22dc socketwip
Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-07 23:12:54 +01:00
Simon Chan
cbd75ff1fb audio option
Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-07 23:12:54 +01:00
Romain Vimont
39cdf68150 Use FakeContext for Application instance
This will expose the correct package name and UID to the application
context.
2023-02-07 23:12:54 +01:00
Romain Vimont
3492c5c963 Use shell package name for workarounds
For consistency.
2023-02-07 23:12:54 +01:00
Romain Vimont
35f4040a78 Use PACKAGE_NAME from FakeContext
Remove duplicated constant.
2023-02-07 23:12:54 +01:00
Romain Vimont
fd7d6f3822 Use AttributionSource from FakeContext
FakeContext already provides an AttributionSource instance.

Co-authored-by: Simon Chan <1330321+yume-chan@users.noreply.github.com>
2023-02-07 23:12:54 +01:00
Simon Chan
6c3e7f6b45 Add a fake Android Context
Since scrcpy-server is not an Android application (it's a Java
executable), it has no Context.

Some features will require a Context instance to get the package name
and the UID. Add a FakeContext for this purpose.

Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-07 23:12:54 +01:00
Romain Vimont
94d90927ad Use Process.ROOT_UID
Replace ServiceManager.USER_ID by existing constant Process.ROOT_UID.
2023-02-07 23:12:54 +01:00
Romain Vimont
93504f5f34 Make streamer independent of codec type
Rename VideoStreamer to Streamer, and extract a Codec interface which
will also support audio codecs.
2023-02-07 23:12:54 +01:00
Romain Vimont
a52b1f025d Pass all args to ScreenEncoder constructor
There is no good reason to pass some of them in the constructor and some
others as parameters of the streamScreen() method.
2023-02-07 23:12:54 +01:00
Romain Vimont
46759d3f63 Move screen encoder initialization
This prepares further refactors.
2023-02-07 23:12:54 +01:00
Romain Vimont
e37f9a4d7c Write streamer header from ScreenEncoder
The screen encoder is responsible for writing data to the video streamer
from its own thread (which will be added later).
2023-02-07 23:12:54 +01:00
Romain Vimont
842fdccaa9 Use VideoStreamer directly from ScreenEncoder
The Callbacks interface notifies of new packets. In addition, the screen
encoder will need to write headers on start from its own thread (which
will be added later).

We could add a function onStart() instead, but for simplicity, just
remove the interface, which brings no value, and call the streamer
directly.

Refs 87972e2022
2023-02-07 23:12:54 +01:00
Romain Vimont
8cdae30b01 Simplify error handling on socket creation
On any error, all previously opened sockets must be closed.

Handle these errors in a single catch-block. Currently, there are only 2
sockets, but this will simplify even more with more sockets.

Note: this commit is better displayed with --ignore-space-change (-b).
2023-02-07 23:12:54 +01:00
Romain Vimont
30b8d140e8 Log component stopped in finally clause
The message must be logged even when no exception occurs.
2023-02-07 23:12:54 +01:00
Romain Vimont
4f986d4bbb Join all threads before end of main
Some calls run from separate threads may throw exceptions after the
main() method has returned.
2023-02-07 23:12:51 +01:00
Romain Vimont
6524e90c68 Remove unused constant
This line was committed by mistake.

Refs 3e517cd40e
2023-02-07 23:11:42 +01:00
Romain Vimont
f2dee20a20 Set power mode on all physical displays
Android 10 and above support multiple physical displays. Apply power
mode to all of them.

Fixes #3716 <https://github.com/Genymobile/scrcpy/issues/3716>
2023-02-06 11:07:14 +01:00
Romain Vimont
d2dce51038 Add support for AV1
Add option --codec=av1.

PR #3713 <https://github.com/Genymobile/scrcpy/pull/3713>
2023-02-06 11:00:49 +01:00
Romain Vimont
4342c5637d Add support for H265
Add option --codec=h265.

PR #3713 <https://github.com/Genymobile/scrcpy/pull/3713>
Fixes #3092 <https://github.com/Genymobile/scrcpy/issues/3092>
2023-02-06 11:00:49 +01:00
Romain Vimont
3e517cd40e Add option to select video codec
Introduce the selection mechanism. Alternative codecs will be added in
further commits.

PR #3713 <https://github.com/Genymobile/scrcpy/pull/3713>
2023-02-06 10:58:45 +01:00
Romain Vimont
f70f6cdd3e Simplify server info initialization
Use sc_read16be() to read 16-bit integer fields.
2023-02-03 12:31:28 +01:00
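
For reference, sc_read16be() simply assembles a big-endian 16-bit value from two bytes. A minimal sketch of an equivalent helper (the name read16be is illustrative; the real helper lives in the client's util headers):

```c
#include <stdint.h>

// Big-endian 16-bit read: the high byte comes first.
static inline uint16_t
read16be(const uint8_t *buf) {
    return (uint16_t) ((buf[0] << 8) | buf[1]);
}

// The device info header encodes width and height as two big-endian
// 16-bit fields placed right after the device name:
//   uint16_t width  = read16be(fields);
//   uint16_t height = read16be(&fields[2]);
```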
Romain Vimont
87972e2022 Extract video streaming to a separate class
ScreenEncoder handled both capture/encoding and sending over the
network.

Move the streaming part to a separate VideoStreamer.
2023-02-03 12:31:28 +01:00
Romain Vimont
3aac74e9e9 Move variable assignment
Computing the eof flag is not necessary if the rotation changed.
2023-02-03 12:31:28 +01:00
Romain Vimont
1c82c3923d Compute relative PTS on the client-side
The PTS received from MediaCodec are expressed relative to an arbitrary
clock origin. We consider the PTS of the first frame to be 0, and the
PTS of every other frame is relative to this first PTS (note that the
PTS is only used for recording, it is ignored for mirroring).

For simplicity, this relative PTS was computed on the server-side.

To prepare support for multiple streams (video and audio), send the
packet with its original PTS, and handle the PTS offset on the
client-side (by the recorder).

Since we can't know in advance which stream will produce the first
packet with the lowest PTS (a packet received later on one stream may
have a PTS lower than a packet received earlier on another stream),
computing the relative PTS on the server-side would require unnecessary
waiting.
2023-02-03 12:31:28 +01:00
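
A minimal sketch of the client-side rebasing described above, assuming a sentinel value meaning "no origin yet" (the actual change appears in the recorder diff below; the helper name is illustrative):

```c
#include <stdint.h>

#define PTS_ORIGIN_NONE UINT64_C(-1)

// The first PTS received on any stream becomes the origin; every packet is
// then rebased so that the recording starts at 0, regardless of which
// stream (video or audio) delivered the first packet.
static int64_t
rebase_pts(uint64_t *pts_origin, int64_t pts) {
    if (*pts_origin == PTS_ORIGIN_NONE) {
        *pts_origin = (uint64_t) pts;
    }
    return pts - (int64_t) *pts_origin;
}
```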
Romain Vimont
36d656e91f Improve workarounds call comments 2023-02-03 12:31:28 +01:00
Romain Vimont
bdbf1f4eb7 Move Workarounds call
Workarounds are not specific to the screen encoder.

Co-authored-by: Simon Chan <1330321+yume-chan@users.noreply.github.com>
2023-02-03 12:31:28 +01:00
Romain Vimont
4177de5880 Do not expose controller threads
The way the controller executes its events asynchronously is an
implementation detail.
2023-02-03 12:31:28 +01:00
Romain Vimont
6a07e3d470 Fix manpage formatting
Only the option arguments must be underlined.
2023-02-03 12:31:28 +01:00
26 changed files with 1036 additions and 164 deletions

View File

@@ -252,10 +252,22 @@ This affects recording orientation.
The [window may also be rotated](#rotation) independently.
#### Encoder
#### Codec
Some devices have more than one encoder, and some of them may cause issues or
crash. It is possible to select a different encoder:
The video codec can be selected. The possible values are `h264` (default),
`h265` and `av1`:
```bash
scrcpy --codec=h264 # default
scrcpy --codec=h265
scrcpy --codec=av1
```
##### Encoder
Some devices have more than one encoder for a specific codec, and some of them
may cause issues or crash. It is possible to select a different encoder:
```bash
scrcpy --encoder=OMX.qcom.video.encoder.avc
@@ -265,7 +277,8 @@ To list the available encoders, you can pass an invalid encoder name; the
error will give the available encoders:
```bash
scrcpy --encoder=_
scrcpy --encoder=_ # for the default codec
scrcpy --codec=h265 --encoder=_ # for a specific codec
```
### Capture

View File

@@ -26,7 +26,11 @@ Encode the video at the given bit\-rate, expressed in bits/s. Unit suffixes are
Default is 8000000.
.TP
.BI "\-\-codec\-options " key[:type]=value[,...]
.BI "\-\-codec " name
Select a video codec (h264, h265 or av1).
.TP
.BI "\-\-codec\-options " key\fR[:\fItype\fR]=\fIvalue\fR[,...]
Set a list of comma-separated key:type=value options for the device encoder.
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
@@ -117,7 +121,7 @@ Inject computer clipboard text as a sequence of key events on Ctrl+v (like MOD+S
This is a workaround for some devices not behaving as expected when setting the device clipboard programmatically.
.TP
.BI "\-\-lock\-video\-orientation[=value]
\fB\-\-lock\-video\-orientation\fR[=\fIvalue\fR]
Lock video orientation to \fIvalue\fR. Possible values are "unlocked", "initial" (locked to the initial orientation), 0, 1, 2 and 3. Natural device orientation is 0, and each increment adds a 90 degrees rotation counterclockwise.
Default is "unlocked".
@@ -199,7 +203,7 @@ It may only work over USB.
See \fB\-\-hid\-keyboard\fR and \fB\-\-hid\-mouse\fR.
.TP
.BI "\-p, \-\-port " port[:port]
.BI "\-p, \-\-port " port\fR[:\fIport\fR]
Set the TCP port (range) used by the client to listen.
Default is 27183:27199.
@@ -260,7 +264,7 @@ Set the initial display rotation. Possible values are 0, 1, 2 and 3. Each incre
The device serial number. Mandatory only if several devices are connected to adb.
.TP
.BI "\-\-shortcut\-mod " key[+...]][,...]
.BI "\-\-shortcut\-mod " key\fR[+...]][,...]
Specify the modifiers to use for scrcpy shortcuts. Possible keys are "lctrl", "rctrl", "lalt", "ralt", "lsuper" and "rsuper".
A shortcut can consist in several keys, separated by '+'. Several shortcuts can be specified, separated by ','.
@@ -270,7 +274,7 @@ For example, to use either LCtrl+LAlt or LSuper for scrcpy shortcuts, pass "lctr
Default is "lalt,lsuper" (left-Alt or left-Super).
.TP
.BI "\-\-tcpip[=ip[:port]]
.BI "\-\-tcpip\fR[=\fIip\fR[:\fIport\fR]]
Configure and reconnect the device over TCP/IP.
If a destination address is provided, then scrcpy connects to this address before starting. The device must listen on the given TCP port (default is 5555).

View File

@@ -57,6 +57,8 @@
#define OPT_NO_CLEANUP 1037
#define OPT_PRINT_FPS 1038
#define OPT_NO_POWER_ON 1039
#define OPT_CODEC 1040
#define OPT_NO_AUDIO 1041
struct sc_option {
char shortopt;
@@ -105,6 +107,12 @@ static const struct sc_option options[] = {
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is " STR(DEFAULT_BIT_RATE) ".",
},
{
.longopt_id = OPT_CODEC,
.longopt = "codec",
.argdesc = "name",
.text = "Select a video codec (h264, h265 or av1).",
},
{
.longopt_id = OPT_CODEC_OPTIONS,
.longopt = "codec-options",
@@ -291,6 +299,11 @@ static const struct sc_option options[] = {
.text = "Do not display device (only when screen recording or V4L2 "
"sink is enabled).",
},
{
.longopt_id = OPT_NO_AUDIO,
.longopt = "no-audio",
.text = "Disable audio forwarding.",
},
{
.longopt_id = OPT_NO_KEY_REPEAT,
.longopt = "no-key-repeat",
@@ -1377,6 +1390,24 @@ guess_record_format(const char *filename) {
return 0;
}
static bool
parse_codec(const char *optarg, enum sc_codec *codec) {
if (!strcmp(optarg, "h264")) {
*codec = SC_CODEC_H264;
return true;
}
if (!strcmp(optarg, "h265")) {
*codec = SC_CODEC_H265;
return true;
}
if (!strcmp(optarg, "av1")) {
*codec = SC_CODEC_AV1;
return true;
}
LOGE("Unsupported codec: %s (expected h264, h265 or av1)", optarg);
return false;
}
static bool
parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
const char *optstring, const struct option *longopts) {
@@ -1601,6 +1632,9 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
case OPT_NO_DOWNSIZE_ON_ERROR:
opts->downsize_on_error = false;
break;
case OPT_NO_AUDIO:
opts->audio = false;
break;
case OPT_NO_CLEANUP:
opts->cleanup = false;
break;
@@ -1610,6 +1644,11 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
case OPT_PRINT_FPS:
opts->start_fps_counter = true;
break;
case OPT_CODEC:
if (!parse_codec(optarg, &opts->codec)) {
return false;
}
break;
case OPT_OTG:
#ifdef HAVE_USB
opts->otg = true;
@@ -1718,6 +1757,13 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
}
}
if (opts->record_format == SC_RECORD_FORMAT_MP4
&& opts->codec == SC_CODEC_AV1) {
LOGE("Could not mux AV1 stream into MP4 container "
"(record to mkv or select another video codec)");
return false;
}
if (!opts->control) {
if (opts->turn_screen_off) {
LOGE("Could not request to turn screen off if control is disabled");

View File

@@ -17,6 +17,34 @@
#define SC_PACKET_PTS_MASK (SC_PACKET_FLAG_KEY_FRAME - 1)
static enum AVCodecID
sc_demuxer_recv_codec_id(struct sc_demuxer *demuxer) {
uint8_t data[4];
ssize_t r = net_recv_all(demuxer->socket, data, 4);
if (r < 4) {
return AV_CODEC_ID_NONE;
}
#define SC_CODEC_ID_H264 UINT32_C(0x68323634) // "h264" in ASCII
#define SC_CODEC_ID_H265 UINT32_C(0x68323635) // "h265" in ASCII
#define SC_CODEC_ID_AV1 UINT32_C(0x00617631) // "av1" in ASCII
#define SC_CODEC_ID_OPUS UINT32_C(0x6f707573) // "opus" in ASCII
uint32_t codec_id = sc_read32be(data);
switch (codec_id) {
case SC_CODEC_ID_H264:
return AV_CODEC_ID_H264;
case SC_CODEC_ID_H265:
return AV_CODEC_ID_HEVC;
case SC_CODEC_ID_AV1:
return AV_CODEC_ID_AV1;
case SC_CODEC_ID_OPUS:
return AV_CODEC_ID_OPUS;
default:
LOGE("Unknown codec id 0x%08" PRIx32, codec_id);
return AV_CODEC_ID_NONE;
}
}
static bool
sc_demuxer_recv_packet(struct sc_demuxer *demuxer, AVPacket *packet) {
// The video stream contains raw packets, without time information. When we
@@ -171,7 +199,13 @@ static int
run_demuxer(void *data) {
struct sc_demuxer *demuxer = data;
const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
enum AVCodecID codec_id = sc_demuxer_recv_codec_id(demuxer);
if (codec_id == AV_CODEC_ID_NONE) {
// Error already logged
goto end;
}
const AVCodec *codec = avcodec_find_decoder(codec_id);
if (!codec) {
LOGE("H.264 decoder not found");
goto end;
@@ -188,7 +222,7 @@ run_demuxer(void *data) {
goto finally_free_codec_ctx;
}
demuxer->parser = av_parser_init(AV_CODEC_ID_H264);
demuxer->parser = av_parser_init(codec_id);
if (!demuxer->parser) {
LOGE("Could not initialize parser");
goto finally_close_sinks;
@@ -239,8 +273,12 @@ end:
}
void
sc_demuxer_init(struct sc_demuxer *demuxer, sc_socket socket,
const struct sc_demuxer_callbacks *cbs, void *cbs_userdata) {
sc_demuxer_init(struct sc_demuxer *demuxer, enum sc_stream_id stream_id,
sc_socket socket, const struct sc_demuxer_callbacks *cbs,
void *cbs_userdata) {
assert(socket != SC_SOCKET_NONE);
demuxer->stream_id = stream_id;
demuxer->socket = socket;
demuxer->pending = NULL;
demuxer->sink_count = 0;
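
The codec id sent before the stream is just the codec name packed into 4 big-endian ASCII bytes, which is how the SC_CODEC_ID_* constants above are derived. A small illustrative helper, not part of the patch, showing the packing:

```c
#include <stdint.h>
#include <string.h>

// Pack up to 4 ASCII characters into a big-endian 32-bit codec id.
// "h264" -> 0x68323634, "opus" -> 0x6f707573, and the 3-letter "av1"
// is left-padded with a zero byte -> 0x00617631.
static uint32_t
pack_codec_id(const char *name) {
    uint32_t id = 0;
    for (size_t i = 0, len = strlen(name); i < len && i < 4; ++i) {
        id = (id << 8) | (uint8_t) name[i];
    }
    return id;
}
```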

View File

@@ -14,7 +14,13 @@
#define SC_DEMUXER_MAX_SINKS 2
enum sc_stream_id {
SC_STREAM_ID_VIDEO,
SC_STREAM_ID_AUDIO,
};
struct sc_demuxer {
enum sc_stream_id stream_id;
sc_socket socket;
sc_thread thread;
@@ -36,8 +42,9 @@ struct sc_demuxer_callbacks {
};
void
sc_demuxer_init(struct sc_demuxer *demuxer, sc_socket socket,
const struct sc_demuxer_callbacks *cbs, void *cbs_userdata);
sc_demuxer_init(struct sc_demuxer *demuxer, enum sc_stream_id stream_id,
sc_socket socket, const struct sc_demuxer_callbacks *cbs,
void *cbs_userdata);
void
sc_demuxer_add_sink(struct sc_demuxer *demuxer, struct sc_packet_sink *sink);

View File

@@ -13,6 +13,7 @@ const struct scrcpy_options scrcpy_options_default = {
.v4l2_device = NULL,
#endif
.log_level = SC_LOG_LEVEL_INFO,
.codec = SC_CODEC_H264,
.record_format = SC_RECORD_FORMAT_AUTO,
.keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT,
.port_range = {
@@ -65,4 +66,5 @@ const struct scrcpy_options scrcpy_options_default = {
.cleanup = true,
.start_fps_counter = false,
.power_on = true,
.audio = true,
};

View File

@@ -23,6 +23,12 @@ enum sc_record_format {
SC_RECORD_FORMAT_MKV,
};
enum sc_codec {
SC_CODEC_H264,
SC_CODEC_H265,
SC_CODEC_AV1,
};
enum sc_lock_video_orientation {
SC_LOCK_VIDEO_ORIENTATION_UNLOCKED = -1,
// lock the current orientation when scrcpy starts
@@ -93,6 +99,7 @@ struct scrcpy_options {
const char *v4l2_device;
#endif
enum sc_log_level log_level;
enum sc_codec codec;
enum sc_record_format record_format;
enum sc_keyboard_input_mode keyboard_input_mode;
enum sc_mouse_input_mode mouse_input_mode;
@@ -140,6 +147,7 @@ struct scrcpy_options {
bool cleanup;
bool start_fps_counter;
bool power_on;
bool audio;
};
extern const struct scrcpy_options scrcpy_options_default;

View File

@@ -11,6 +11,8 @@
/** Downcast packet_sink to recorder */
#define DOWNCAST(SINK) container_of(SINK, struct sc_recorder, packet_sink)
#define SC_PTS_ORIGIN_NONE UINT64_C(-1)
static const AVRational SCRCPY_TIME_BASE = {1, 1000000}; // timestamps in us
static const AVOutputFormat *
@@ -169,6 +171,18 @@ run_recorder(void *data) {
sc_mutex_unlock(&recorder->mutex);
if (recorder->pts_origin == SC_PTS_ORIGIN_NONE
&& rec->packet->pts != AV_NOPTS_VALUE) {
// First PTS received
recorder->pts_origin = rec->packet->pts;
}
if (rec->packet->pts != AV_NOPTS_VALUE) {
// Set PTS relative to the origin
rec->packet->pts -= recorder->pts_origin;
rec->packet->dts = rec->packet->pts;
}
// recorder->previous is only written from this thread, no need to lock
struct sc_record_packet *previous = recorder->previous;
recorder->previous = rec;
@@ -243,6 +257,7 @@ sc_recorder_open(struct sc_recorder *recorder, const AVCodec *input_codec) {
recorder->failed = false;
recorder->header_written = false;
recorder->previous = NULL;
recorder->pts_origin = SC_PTS_ORIGIN_NONE;
const char *format_name = sc_recorder_get_format_name(recorder->format);
assert(format_name);

View File

@@ -28,6 +28,8 @@ struct sc_recorder {
struct sc_size declared_frame_size;
bool header_written;
uint64_t pts_origin;
sc_thread thread;
sc_mutex mutex;
sc_cond queue_cond;

View File

@@ -40,7 +40,8 @@
struct scrcpy {
struct sc_server server;
struct sc_screen screen;
struct sc_demuxer demuxer;
struct sc_demuxer video_demuxer;
struct sc_demuxer audio_demuxer;
struct sc_decoder decoder;
struct sc_recorder recorder;
#ifdef HAVE_V4L2
@@ -233,13 +234,21 @@ av_log_callback(void *avcl, int level, const char *fmt, va_list vl) {
}
static void
sc_demuxer_on_eos(struct sc_demuxer *demuxer, void *userdata) {
sc_video_demuxer_on_eos(struct sc_demuxer *demuxer, void *userdata) {
(void) demuxer;
(void) userdata;
PUSH_EVENT(EVENT_STREAM_STOPPED);
}
static void
sc_audio_demuxer_on_eos(struct sc_demuxer *demuxer, void *userdata) {
(void) demuxer;
(void) userdata;
// TODO
}
static void
sc_server_on_connection_failed(struct sc_server *server, void *userdata) {
(void) server;
@@ -295,7 +304,8 @@ scrcpy(struct scrcpy_options *options) {
#ifdef HAVE_V4L2
bool v4l2_sink_initialized = false;
#endif
bool demuxer_started = false;
bool video_demuxer_started = false;
bool audio_demuxer_started = false;
#ifdef HAVE_USB
bool aoa_hid_initialized = false;
bool hid_keyboard_initialized = false;
@@ -315,6 +325,7 @@ scrcpy(struct scrcpy_options *options) {
.select_usb = options->select_usb,
.select_tcpip = options->select_tcpip,
.log_level = options->log_level,
.codec = options->codec,
.crop = options->crop,
.port_range = options->port_range,
.tunnel_host = options->tunnel_host,
@@ -325,6 +336,7 @@ scrcpy(struct scrcpy_options *options) {
.lock_video_orientation = options->lock_video_orientation,
.control = options->control,
.display_id = options->display_id,
.audio = options->audio,
.show_touches = options->show_touches,
.stay_awake = options->stay_awake,
.codec_options = options->codec_options,
@@ -419,17 +431,26 @@ scrcpy(struct scrcpy_options *options) {
av_log_set_callback(av_log_callback);
static const struct sc_demuxer_callbacks demuxer_cbs = {
.on_eos = sc_demuxer_on_eos,
static const struct sc_demuxer_callbacks video_demuxer_cbs = {
.on_eos = sc_video_demuxer_on_eos,
};
sc_demuxer_init(&s->demuxer, s->server.video_socket, &demuxer_cbs, NULL);
sc_demuxer_init(&s->video_demuxer, SC_STREAM_ID_VIDEO,
s->server.video_socket, &video_demuxer_cbs, NULL);
if (options->audio) {
static const struct sc_demuxer_callbacks audio_demuxer_cbs = {
.on_eos = sc_audio_demuxer_on_eos,
};
sc_demuxer_init(&s->audio_demuxer, SC_STREAM_ID_AUDIO,
s->server.audio_socket, &audio_demuxer_cbs, NULL);
}
if (dec) {
sc_demuxer_add_sink(&s->demuxer, &dec->packet_sink);
sc_demuxer_add_sink(&s->video_demuxer, &dec->packet_sink);
}
if (rec) {
sc_demuxer_add_sink(&s->demuxer, &rec->packet_sink);
sc_demuxer_add_sink(&s->video_demuxer, &rec->packet_sink);
}
struct sc_controller *controller = NULL;
@@ -637,17 +658,24 @@ aoa_hid_end:
#endif
// now we consumed the header values, the socket receives the video stream
// start the demuxer
if (!sc_demuxer_start(&s->demuxer)) {
// start the video demuxer
if (!sc_demuxer_start(&s->video_demuxer)) {
goto end;
}
demuxer_started = true;
video_demuxer_started = true;
if (options->audio) {
if (!sc_demuxer_start(&s->audio_demuxer)) {
goto end;
}
audio_demuxer_started = true;
}
ret = event_loop(s);
LOGD("quit...");
// Close the window immediately on closing, because screen_destroy() may
// only be called once the demuxer thread is joined (it may take time)
// only be called once the video demuxer thread is joined (it may take time)
sc_screen_hide_window(&s->screen);
end:
@@ -685,8 +713,12 @@ end:
// now that the sockets are shutdown, the demuxer and controller are
// interrupted, we can join them
if (demuxer_started) {
sc_demuxer_join(&s->demuxer);
if (video_demuxer_started) {
sc_demuxer_join(&s->video_demuxer);
}
if (audio_demuxer_started) {
sc_demuxer_join(&s->audio_demuxer);
}
#ifdef HAVE_V4L2
@@ -705,8 +737,9 @@ end:
}
#endif
// Destroy the screen only after the demuxer is guaranteed to be finished,
// because otherwise the screen could receive new frames after destruction
// Destroy the screen only after the video demuxer is guaranteed to be
// finished, because otherwise the screen could receive new frames after
// destruction
if (screen_initialized) {
sc_screen_join(&s->screen);
sc_screen_destroy(&s->screen);

View File

@@ -8,6 +8,7 @@
#include <SDL2/SDL_platform.h>
#include "adb/adb.h"
#include "util/binary.h"
#include "util/file.h"
#include "util/log.h"
#include "util/net_intr.h"
@@ -155,6 +156,20 @@ sc_server_sleep(struct sc_server *server, sc_tick deadline) {
return !stopped;
}
static const char *
sc_server_get_codec_name(enum sc_codec codec) {
switch (codec) {
case SC_CODEC_H264:
return "h264";
case SC_CODEC_H265:
return "h265";
case SC_CODEC_AV1:
return "av1";
default:
return NULL;
}
}
static sc_pid
execute_server(struct sc_server *server,
const struct sc_server_params *params) {
@@ -202,6 +217,12 @@ execute_server(struct sc_server *server,
ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level));
ADD_PARAM("bit_rate=%" PRIu32, params->bit_rate);
if (!params->audio) {
ADD_PARAM("audio=false");
}
if (params->codec != SC_CODEC_H264) {
ADD_PARAM("codec=%s", sc_server_get_codec_name(params->codec));
}
if (params->max_size) {
ADD_PARAM("max_size=%" PRIu16, params->max_size);
}
@@ -370,6 +391,7 @@ sc_server_init(struct sc_server *server, const struct sc_server_params *params,
server->stopped = false;
server->video_socket = SC_SOCKET_NONE;
server->audio_socket = SC_SOCKET_NONE;
server->control_socket = SC_SOCKET_NONE;
sc_adb_tunnel_init(&server->tunnel);
@@ -398,10 +420,9 @@ device_read_info(struct sc_intr *intr, sc_socket device_socket,
buf[SC_DEVICE_NAME_FIELD_LENGTH - 1] = '\0';
memcpy(info->device_name, (char *) buf, sizeof(info->device_name));
info->frame_size.width = (buf[SC_DEVICE_NAME_FIELD_LENGTH] << 8)
| buf[SC_DEVICE_NAME_FIELD_LENGTH + 1];
info->frame_size.height = (buf[SC_DEVICE_NAME_FIELD_LENGTH + 2] << 8)
| buf[SC_DEVICE_NAME_FIELD_LENGTH + 3];
unsigned char *fields = &buf[SC_DEVICE_NAME_FIELD_LENGTH];
info->frame_size.width = sc_read16be(fields);
info->frame_size.height = sc_read16be(&fields[2]);
return true;
}
@@ -414,9 +435,11 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
const char *serial = server->serial;
assert(serial);
bool audio = server->params.audio;
bool control = server->params.control;
sc_socket video_socket = SC_SOCKET_NONE;
sc_socket audio_socket = SC_SOCKET_NONE;
sc_socket control_socket = SC_SOCKET_NONE;
if (!tunnel->forward) {
video_socket = net_accept_intr(&server->intr, tunnel->server_socket);
@@ -424,6 +447,14 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
goto fail;
}
if (audio) {
audio_socket =
net_accept_intr(&server->intr, tunnel->server_socket);
if (audio_socket == SC_SOCKET_NONE) {
goto fail;
}
}
if (control) {
control_socket =
net_accept_intr(&server->intr, tunnel->server_socket);
@@ -450,6 +481,18 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
goto fail;
}
if (audio) {
audio_socket = net_socket();
if (audio_socket == SC_SOCKET_NONE) {
goto fail;
}
bool ok = net_connect_intr(&server->intr, audio_socket, tunnel_host,
tunnel_port);
if (!ok) {
goto fail;
}
}
if (control) {
// we know that the device is listening, we don't need several
// attempts
@@ -476,9 +519,11 @@ sc_server_connect_to(struct sc_server *server, struct sc_server_info *info) {
}
assert(video_socket != SC_SOCKET_NONE);
assert(!audio || audio_socket != SC_SOCKET_NONE);
assert(!control || control_socket != SC_SOCKET_NONE);
server->video_socket = video_socket;
server->audio_socket = audio_socket;
server->control_socket = control_socket;
return true;
@@ -490,6 +535,12 @@ fail:
}
}
if (audio_socket != SC_SOCKET_NONE) {
if (!net_close(audio_socket)) {
LOGW("Could not close audio socket");
}
}
if (control_socket != SC_SOCKET_NONE) {
if (!net_close(control_socket)) {
LOGW("Could not close control socket");
@@ -835,6 +886,11 @@ run_server(void *data) {
assert(server->video_socket != SC_SOCKET_NONE);
net_interrupt(server->video_socket);
if (server->audio_socket != SC_SOCKET_NONE) {
// There is no audio_socket if --no-audio is set
net_interrupt(server->audio_socket);
}
if (server->control_socket != SC_SOCKET_NONE) {
// There is no control_socket if --no-control is set
net_interrupt(server->control_socket);
@@ -896,6 +952,9 @@ sc_server_destroy(struct sc_server *server) {
if (server->video_socket != SC_SOCKET_NONE) {
net_close(server->video_socket);
}
if (server->audio_socket != SC_SOCKET_NONE) {
net_close(server->audio_socket);
}
if (server->control_socket != SC_SOCKET_NONE) {
net_close(server->control_socket);
}

View File

@@ -25,6 +25,7 @@ struct sc_server_params {
uint32_t uid;
const char *req_serial;
enum sc_log_level log_level;
enum sc_codec codec;
const char *crop;
const char *codec_options;
const char *encoder_name;
@@ -37,6 +38,7 @@ struct sc_server_params {
int8_t lock_video_orientation;
bool control;
uint32_t display_id;
bool audio;
bool show_touches;
bool stay_awake;
bool force_adb_forward;
@@ -68,6 +70,7 @@ struct sc_server {
struct sc_adb_tunnel tunnel;
sc_socket video_socket;
sc_socket audio_socket;
sc_socket control_socket;
const struct sc_server_callbacks *cbs;

View File

@@ -0,0 +1,46 @@
package com.genymobile.scrcpy;
import android.media.MediaFormat;
public enum AudioCodec implements Codec {
OPUS(0x6f_70_75_73, "opus", MediaFormat.MIMETYPE_AUDIO_OPUS);
private final int id; // 4-byte ASCII representation of the name
private final String name;
private final String mimeType;
AudioCodec(int id, String name, String mimeType) {
this.id = id;
this.name = name;
this.mimeType = mimeType;
}
@Override
public Type getType() {
return Type.AUDIO;
}
@Override
public int getId() {
return id;
}
@Override
public String getName() {
return name;
}
@Override
public String getMimeType() {
return mimeType;
}
public static AudioCodec findByName(String name) {
for (AudioCodec codec : values()) {
if (codec.name.equals(name)) {
return codec;
}
}
return null;
}
}

View File

@@ -0,0 +1,308 @@
package com.genymobile.scrcpy;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioTimestamp;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaRecorder;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.SystemClock;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicBoolean;
public final class AudioEncoder {
private static class InputTask {
final int index;
InputTask(int index) {
this.index = index;
}
}
private static class OutputTask {
final int index;
final MediaCodec.BufferInfo bufferInfo;
OutputTask(int index, MediaCodec.BufferInfo bufferInfo) {
this.index = index;
this.bufferInfo = bufferInfo;
}
}
private static final String MIMETYPE = MediaFormat.MIMETYPE_AUDIO_OPUS;
private static final int SAMPLE_RATE = 48000;
private static final int CHANNELS = 2;
private static final int BIT_RATE = 128000;
private static final int BUFFER_MS = 15; // milliseconds
private static final int BUFFER_SIZE = SAMPLE_RATE * CHANNELS * BUFFER_MS / 1000;
private final Streamer streamer;
private AudioRecord recorder;
private MediaCodec mediaCodec;
// Capacity of 64 is in practice "infinite" (it is limited by the number of available MediaCodec buffers, typically 4).
// So many pending tasks would lead to an unacceptable delay anyway.
private final BlockingQueue<InputTask> inputTasks = new ArrayBlockingQueue<>(64);
private final BlockingQueue<OutputTask> outputTasks = new ArrayBlockingQueue<>(64);
private Thread thread;
private HandlerThread mediaCodecThread;
private Thread inputThread;
private Thread outputThread;
private boolean ended;
public AudioEncoder(Streamer streamer) {
this.streamer = streamer;
}
private static AudioFormat createAudioFormat() {
AudioFormat.Builder builder = new AudioFormat.Builder();
builder.setEncoding(AudioFormat.ENCODING_PCM_16BIT);
builder.setSampleRate(SAMPLE_RATE);
builder.setChannelMask(CHANNELS == 2 ? AudioFormat.CHANNEL_IN_STEREO : AudioFormat.CHANNEL_IN_MONO);
return builder.build();
}
@TargetApi(Build.VERSION_CODES.M)
@SuppressLint({"WrongConstant", "MissingPermission"})
private static AudioRecord createAudioRecord() {
AudioRecord.Builder builder = new AudioRecord.Builder();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
// On older APIs, Workarounds.fillAppInfo() must be called beforehand
builder.setContext(FakeContext.get());
}
builder.setAudioSource(MediaRecorder.AudioSource.REMOTE_SUBMIX);
builder.setAudioFormat(createAudioFormat());
builder.setBufferSizeInBytes(1024 * 1024);
return builder.build();
}
private static MediaFormat createFormat() {
MediaFormat format = new MediaFormat();
format.setString(MediaFormat.KEY_MIME, MIMETYPE);
format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, CHANNELS);
format.setInteger(MediaFormat.KEY_SAMPLE_RATE, SAMPLE_RATE);
return format;
}
@TargetApi(Build.VERSION_CODES.N)
private void inputThread() throws IOException, InterruptedException {
final AudioTimestamp timestamp = new AudioTimestamp();
long previousPts = 0;
long nextPts = 0;
while (!Thread.currentThread().isInterrupted()) {
InputTask task = inputTasks.take();
ByteBuffer buffer = mediaCodec.getInputBuffer(task.index);
int r = recorder.read(buffer, BUFFER_SIZE);
if (r < 0) {
throw new IOException("Could not read audio: " + r);
}
long pts;
int ret = recorder.getTimestamp(timestamp, AudioTimestamp.TIMEBASE_MONOTONIC);
if (ret == AudioRecord.SUCCESS) {
pts = timestamp.nanoTime / 1000;
} else {
if (nextPts == 0) {
Ln.w("Could not get any audio timestamp");
}
// compute from previous timestamp and packet size
pts = nextPts;
}
long durationUs = r * 1_000_000L / (CHANNELS * 2) / SAMPLE_RATE; // r is in bytes, 16-bit PCM (2 bytes per sample)
nextPts = pts + durationUs;
if (previousPts != 0 && pts < previousPts) {
// Audio PTS may come from two sources:
// - recorder.getTimestamp() if the call works;
// - an estimation from the previous PTS and the packet size as a fallback.
//
// Therefore, the property that PTS are monotonically increasing is not guaranteed in corner cases, so enforce it.
pts = previousPts + 1;
}
mediaCodec.queueInputBuffer(task.index, 0, r, pts, 0);
previousPts = pts;
}
}
private void outputThread() throws IOException, InterruptedException {
streamer.writeHeader();
while (!Thread.currentThread().isInterrupted()) {
OutputTask task = outputTasks.take();
ByteBuffer buffer = mediaCodec.getOutputBuffer(task.index);
try {
streamer.writePacket(buffer, task.bufferInfo);
} finally {
mediaCodec.releaseOutputBuffer(task.index, false);
}
}
}
public void start() {
thread = new Thread(() -> {
try {
encode();
} catch (IOException e) {
// this is expected on close
} finally {
Ln.d("Audio encoder stopped");
}
});
thread.start();
}
public void stop() {
if (thread != null) {
// Just wake up the blocking wait from the thread, so that it properly releases all its resources and terminates
end();
}
}
public void join() throws InterruptedException {
if (thread != null) {
thread.join();
}
}
private synchronized void end() {
ended = true;
notify();
}
private synchronized void waitEnded() {
try {
while (!ended) {
wait();
}
} catch (InterruptedException e) {
// ignore
}
}
@TargetApi(Build.VERSION_CODES.M)
public void encode() throws IOException {
mediaCodec = MediaCodec.createEncoderByType(MIMETYPE); // may throw IOException
try {
recorder = createAudioRecord();
mediaCodecThread = new HandlerThread("AudioEncoder");
mediaCodecThread.start();
MediaFormat format = createFormat();
mediaCodec.setCallback(new EncoderCallback(), new Handler(mediaCodecThread.getLooper()));
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
recorder.startRecording();
inputThread = new Thread(() -> {
try {
inputThread();
} catch (IOException | InterruptedException e) {
// this is expected on close
} finally {
end();
}
});
outputThread = new Thread(() -> {
try {
outputThread();
} catch (IOException | InterruptedException e) {
// this is expected on close
} finally {
end();
}
});
mediaCodec.start();
inputThread.start();
outputThread.start();
} catch (Throwable e) {
mediaCodec.release();
if (recorder != null) {
recorder.release();
}
throw e;
}
try {
waitEnded();
} finally {
cleanUp();
}
}
private void cleanUp() {
mediaCodecThread.getLooper().quit();
inputThread.interrupt();
outputThread.interrupt();
try {
mediaCodecThread.join();
inputThread.join();
outputThread.join();
} catch (InterruptedException e) {
// Should never happen
throw new AssertionError(e);
}
mediaCodec.stop();
mediaCodec.release();
recorder.stop();
recorder.release();
}
private class EncoderCallback extends MediaCodec.Callback {
@TargetApi(Build.VERSION_CODES.N)
@Override
public void onInputBufferAvailable(MediaCodec codec, int index) {
try {
inputTasks.put(new InputTask(index));
} catch (InterruptedException e) {
end();
}
}
@Override
public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo bufferInfo) {
try {
outputTasks.put(new OutputTask(index, bufferInfo));
} catch (InterruptedException e) {
end();
}
}
@Override
public void onError(MediaCodec codec, MediaCodec.CodecException e) {
Ln.e("MediaCodec error", e);
end();
}
@Override
public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
// ignore
}
}
}

View File

@@ -0,0 +1,11 @@
package com.genymobile.scrcpy;
public interface Codec {
enum Type {VIDEO, AUDIO}
Type getType();
int getId();
String getName();
String getMimeType();
}

View File

@@ -24,6 +24,8 @@ public class Controller {
private static final ScheduledExecutorService EXECUTOR = Executors.newSingleThreadScheduledExecutor();
private Thread thread;
private final Device device;
private final DesktopConnection connection;
private final DeviceMessageSender sender;
@@ -62,7 +64,7 @@ public class Controller {
}
}
public void control() throws IOException {
private void control() throws IOException {
// on start, power on the device
if (powerOn && !Device.isScreenOn()) {
device.pressReleaseKeycode(KeyEvent.KEYCODE_POWER, Device.INJECT_MODE_ASYNC);
@@ -82,6 +84,34 @@ public class Controller {
}
}
public void start() {
thread = new Thread(() -> {
try {
control();
} catch (IOException e) {
// this is expected on close
} finally {
Ln.d("Controller stopped");
}
});
thread.start();
sender.start();
}
public void stop() {
if (thread != null) {
thread.interrupt();
}
sender.stop();
}
public void join() throws InterruptedException {
if (thread != null) {
thread.join();
}
sender.join();
}
public DeviceMessageSender getSender() {
return sender;
}

View File

@@ -20,6 +20,9 @@ public final class DesktopConnection implements Closeable {
private final LocalSocket videoSocket;
private final FileDescriptor videoFd;
private final LocalSocket audioSocket;
private final FileDescriptor audioFd;
private final LocalSocket controlSocket;
private final InputStream controlInputStream;
private final OutputStream controlOutputStream;
@@ -27,9 +30,10 @@ public final class DesktopConnection implements Closeable {
private final ControlMessageReader reader = new ControlMessageReader();
private final DeviceMessageWriter writer = new DeviceMessageWriter();
private DesktopConnection(LocalSocket videoSocket, LocalSocket controlSocket) throws IOException {
private DesktopConnection(LocalSocket videoSocket, LocalSocket audioSocket, LocalSocket controlSocket) throws IOException {
this.videoSocket = videoSocket;
this.controlSocket = controlSocket;
this.audioSocket = audioSocket;
if (controlSocket != null) {
controlInputStream = controlSocket.getInputStream();
controlOutputStream = controlSocket.getOutputStream();
@@ -38,6 +42,7 @@ public final class DesktopConnection implements Closeable {
controlOutputStream = null;
}
videoFd = videoSocket.getFileDescriptor();
audioFd = audioSocket != null ? audioSocket.getFileDescriptor() : null;
}
private static LocalSocket connect(String abstractName) throws IOException {
@@ -55,40 +60,50 @@ public final class DesktopConnection implements Closeable {
return SOCKET_NAME_PREFIX + String.format("_%08x", uid);
}
public static DesktopConnection open(int uid, boolean tunnelForward, boolean control, boolean sendDummyByte) throws IOException {
public static DesktopConnection open(int uid, boolean tunnelForward, boolean audio, boolean control, boolean sendDummyByte) throws IOException {
String socketName = getSocketName(uid);
LocalSocket videoSocket;
LocalSocket videoSocket = null;
LocalSocket audioSocket = null;
LocalSocket controlSocket = null;
if (tunnelForward) {
try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) {
videoSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
videoSocket.getOutputStream().write(0);
}
if (control) {
try {
try {
if (tunnelForward) {
try (LocalServerSocket localServerSocket = new LocalServerSocket(socketName)) {
videoSocket = localServerSocket.accept();
if (sendDummyByte) {
// send one byte so the client may read() to detect a connection error
videoSocket.getOutputStream().write(0);
}
if (audio) {
audioSocket = localServerSocket.accept();
}
if (control) {
controlSocket = localServerSocket.accept();
} catch (IOException | RuntimeException e) {
videoSocket.close();
throw e;
}
}
}
} else {
videoSocket = connect(socketName);
if (control) {
try {
} else {
videoSocket = connect(socketName);
if (audio) {
audioSocket = connect(socketName);
}
if (control) {
controlSocket = connect(socketName);
} catch (IOException | RuntimeException e) {
videoSocket.close();
throw e;
}
}
} catch (IOException | RuntimeException e) {
if (videoSocket != null) {
videoSocket.close();
}
if (audioSocket != null) {
audioSocket.close();
}
if (controlSocket != null) {
controlSocket.close();
}
throw e;
}
return new DesktopConnection(videoSocket, controlSocket);
return new DesktopConnection(videoSocket, audioSocket, controlSocket);
}
public void close() throws IOException {
@@ -121,6 +136,10 @@ public final class DesktopConnection implements Closeable {
return videoFd;
}
public FileDescriptor getAudioFd() {
return audioFd;
}
public ControlMessage receiveControlMessage() throws IOException {
ControlMessage msg = reader.next();
while (msg == null) {

View File

@@ -277,6 +277,26 @@ public final class Device {
* @param mode one of the {@code POWER_MODE_*} constants
*/
public static boolean setScreenPowerMode(int mode) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
// Change the power mode for all physical displays
long[] physicalDisplayIds = SurfaceControl.getPhysicalDisplayIds();
if (physicalDisplayIds == null) {
Ln.e("Could not get physical display ids");
return false;
}
boolean allOk = true;
for (long physicalDisplayId : physicalDisplayIds) {
IBinder binder = SurfaceControl.getPhysicalDisplayToken(physicalDisplayId);
boolean ok = SurfaceControl.setDisplayPowerMode(binder, mode);
if (!ok) {
allOk = false;
}
}
return allOk;
}
// Older Android versions, only 1 display
IBinder d = SurfaceControl.getBuiltInDisplay();
if (d == null) {
Ln.e("Could not get built-in display");

View File

@@ -6,6 +6,8 @@ public final class DeviceMessageSender {
private final DesktopConnection connection;
private Thread thread;
private String clipboardText;
private long ack;
@@ -24,7 +26,7 @@ public final class DeviceMessageSender {
notify();
}
public void loop() throws IOException, InterruptedException {
private void loop() throws IOException, InterruptedException {
while (!Thread.currentThread().isInterrupted()) {
String text;
long sequence;
@@ -49,4 +51,28 @@ public final class DeviceMessageSender {
}
}
}
public void start() {
thread = new Thread(() -> {
try {
loop();
} catch (IOException | InterruptedException e) {
// this is expected on close
} finally {
Ln.d("Device message sender stopped");
}
});
thread.start();
}
public void stop() {
if (thread != null) {
thread.interrupt();
}
}
public void join() throws InterruptedException {
if (thread != null) {
thread.join();
}
}
}

View File

@@ -5,9 +5,12 @@ import android.graphics.Rect;
import java.util.List;
public class Options {
private Ln.Level logLevel = Ln.Level.DEBUG;
private int uid = -1; // 31-bit non-negative value, or -1
private boolean audio = true;
private int maxSize;
private VideoCodec codec = VideoCodec.H264;
private int bitRate = 8000000;
private int maxFps;
private int lockVideoOrientation = -1;
@@ -29,6 +32,7 @@ public class Options {
private boolean sendDeviceMeta = true; // send device name and size
private boolean sendFrameMeta = true; // send PTS so that the client may record properly
private boolean sendDummyByte = true; // write a byte on start to detect connection issues
private boolean sendCodecId = true; // write the codec ID (4 bytes) before the stream
public Ln.Level getLogLevel() {
return logLevel;
@@ -46,6 +50,14 @@ public class Options {
this.uid = uid;
}
public boolean getAudio() {
return audio;
}
public void setAudio(boolean audio) {
this.audio = audio;
}
public int getMaxSize() {
return maxSize;
}
@@ -54,6 +66,14 @@ public class Options {
this.maxSize = maxSize;
}
public VideoCodec getCodec() {
return codec;
}
public void setCodec(VideoCodec codec) {
this.codec = codec;
}
public int getBitRate() {
return bitRate;
}
@@ -205,4 +225,12 @@ public class Options {
public void setSendDummyByte(boolean sendDummyByte) {
this.sendDummyByte = sendDummyByte;
}
public boolean getSendCodecId() {
return sendCodecId;
}
public void setSendCodecId(boolean sendCodecId) {
this.sendCodecId = sendCodecId;
}
}

View File

@@ -12,7 +12,6 @@ import android.os.IBinder;
import android.os.SystemClock;
import android.view.Surface;
import java.io.FileDescriptor;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;
@@ -30,26 +29,23 @@ public class ScreenEncoder implements Device.RotationListener {
private static final int[] MAX_SIZE_FALLBACK = {2560, 1920, 1600, 1280, 1024, 800};
private static final int MAX_CONSECUTIVE_ERRORS = 3;
private static final long PACKET_FLAG_CONFIG = 1L << 63;
private static final long PACKET_FLAG_KEY_FRAME = 1L << 62;
private final AtomicBoolean rotationChanged = new AtomicBoolean();
private final ByteBuffer headerBuffer = ByteBuffer.allocate(12);
private final Device device;
private final Streamer streamer;
private final String encoderName;
private final List<CodecOption> codecOptions;
private final int bitRate;
private final int maxFps;
private final boolean sendFrameMeta;
private final boolean downsizeOnError;
private long ptsOrigin;
private boolean firstFrameSent;
private int consecutiveErrors;
public ScreenEncoder(boolean sendFrameMeta, int bitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
boolean downsizeOnError) {
this.sendFrameMeta = sendFrameMeta;
public ScreenEncoder(Device device, Streamer streamer, int bitRate, int maxFps, List<CodecOption> codecOptions,
String encoderName, boolean downsizeOnError) {
this.device = device;
this.streamer = streamer;
this.bitRate = bitRate;
this.maxFps = maxFps;
this.codecOptions = codecOptions;
@@ -66,11 +62,15 @@ public class ScreenEncoder implements Device.RotationListener {
return rotationChanged.getAndSet(false);
}
public void streamScreen(Device device, FileDescriptor fd) throws IOException {
MediaCodec codec = createCodec(encoderName);
MediaFormat format = createFormat(bitRate, maxFps, codecOptions);
public void streamScreen() throws IOException {
String videoMimeType = streamer.getCodec().getMimeType();
MediaCodec codec = createCodec(videoMimeType, encoderName);
MediaFormat format = createFormat(videoMimeType, bitRate, maxFps, codecOptions);
IBinder display = createDisplay();
device.setRotationListener(this);
streamer.writeHeader();
boolean alive;
try {
do {
@@ -95,7 +95,7 @@ public class ScreenEncoder implements Device.RotationListener {
codec.start();
alive = encode(codec, fd);
alive = encode(codec, streamer);
// do not call stop() on exception, it would trigger an IllegalStateException
codec.stop();
} catch (IllegalStateException | IllegalArgumentException e) {
@@ -164,31 +164,30 @@ public class ScreenEncoder implements Device.RotationListener {
return 0;
}
private boolean encode(MediaCodec codec, FileDescriptor fd) throws IOException {
private boolean encode(MediaCodec codec, Streamer streamer) throws IOException {
boolean eof = false;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (!consumeRotationChange() && !eof) {
int outputBufferId = codec.dequeueOutputBuffer(bufferInfo, -1);
eof = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
try {
if (consumeRotationChange()) {
// must restart encoding with new size
break;
}
eof = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
if (outputBufferId >= 0) {
ByteBuffer codecBuffer = codec.getOutputBuffer(outputBufferId);
if (sendFrameMeta) {
writeFrameMeta(fd, bufferInfo, codecBuffer.remaining());
}
IO.writeFully(fd, codecBuffer);
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
boolean isConfig = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
if (!isConfig) {
// If this is not a config packet, then it contains a frame
firstFrameSent = true;
consecutiveErrors = 0;
}
streamer.writePacket(codecBuffer, bufferInfo);
}
} finally {
if (outputBufferId >= 0) {
@@ -200,50 +199,28 @@ public class ScreenEncoder implements Device.RotationListener {
return !eof;
}
private void writeFrameMeta(FileDescriptor fd, MediaCodec.BufferInfo bufferInfo, int packetSize) throws IOException {
headerBuffer.clear();
long pts;
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
pts = PACKET_FLAG_CONFIG; // non-media data packet
} else {
if (ptsOrigin == 0) {
ptsOrigin = bufferInfo.presentationTimeUs;
}
pts = bufferInfo.presentationTimeUs - ptsOrigin;
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0) {
pts |= PACKET_FLAG_KEY_FRAME;
}
}
headerBuffer.putLong(pts);
headerBuffer.putInt(packetSize);
headerBuffer.flip();
IO.writeFully(fd, headerBuffer);
}
private static MediaCodecInfo[] listEncoders() {
private static MediaCodecInfo[] listEncoders(String videoMimeType) {
List<MediaCodecInfo> result = new ArrayList<>();
MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (MediaCodecInfo codecInfo : list.getCodecInfos()) {
if (codecInfo.isEncoder() && Arrays.asList(codecInfo.getSupportedTypes()).contains(MediaFormat.MIMETYPE_VIDEO_AVC)) {
if (codecInfo.isEncoder() && Arrays.asList(codecInfo.getSupportedTypes()).contains(videoMimeType)) {
result.add(codecInfo);
}
}
return result.toArray(new MediaCodecInfo[result.size()]);
}
private static MediaCodec createCodec(String encoderName) throws IOException {
private static MediaCodec createCodec(String videoMimeType, String encoderName) throws IOException {
if (encoderName != null) {
Ln.d("Creating encoder by name: '" + encoderName + "'");
try {
return MediaCodec.createByCodecName(encoderName);
} catch (IllegalArgumentException e) {
MediaCodecInfo[] encoders = listEncoders();
MediaCodecInfo[] encoders = listEncoders(videoMimeType);
throw new InvalidEncoderException(encoderName, encoders);
}
}
MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
MediaCodec codec = MediaCodec.createEncoderByType(videoMimeType);
Ln.d("Using encoder: '" + codec.getName() + "'");
return codec;
}
@@ -265,9 +242,9 @@ public class ScreenEncoder implements Device.RotationListener {
Ln.d("Codec option set: " + key + " (" + value.getClass().getSimpleName() + ") = " + value);
}
private static MediaFormat createFormat(int bitRate, int maxFps, List<CodecOption> codecOptions) {
private static MediaFormat createFormat(String videoMimeType, int bitRate, int maxFps, List<CodecOption> codecOptions) {
MediaFormat format = new MediaFormat();
format.setString(MediaFormat.KEY_MIME, MediaFormat.MIMETYPE_VIDEO_AVC);
format.setString(MediaFormat.KEY_MIME, videoMimeType);
format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
// must be present to configure the encoder, but does not impact the actual frame rate, which is variable
format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);

View File

@@ -66,51 +66,84 @@ public final class Server {
Thread initThread = startInitThread(options);
Workarounds.prepareMainLooper();
if (Build.BRAND.equalsIgnoreCase("meizu")) {
// <https://github.com/Genymobile/scrcpy/issues/240>
// <https://github.com/Genymobile/scrcpy/issues/2656>
Workarounds.fillAppInfo();
}
int uid = options.getUid();
boolean tunnelForward = options.isTunnelForward();
boolean control = options.getControl();
boolean audio = options.getAudio();
boolean sendDummyByte = options.getSendDummyByte();
try (DesktopConnection connection = DesktopConnection.open(uid, tunnelForward, control, sendDummyByte)) {
Workarounds.prepareMainLooper();
// Workarounds must be applied for Meizu phones:
// - <https://github.com/Genymobile/scrcpy/issues/240>
// - <https://github.com/Genymobile/scrcpy/issues/365>
// - <https://github.com/Genymobile/scrcpy/issues/2656>
//
// But only apply when strictly necessary, since workarounds can cause other issues:
// - <https://github.com/Genymobile/scrcpy/issues/940>
// - <https://github.com/Genymobile/scrcpy/issues/994>
boolean mustFillAppInfo = Build.BRAND.equalsIgnoreCase("meizu");
// Before Android 11, audio is not supported.
// Since Android 12, we can properly set a context on the AudioRecord.
// Only on Android 11 we must fill app info for the AudioRecord to work.
mustFillAppInfo |= audio && Build.VERSION.SDK_INT == Build.VERSION_CODES.R;
if (mustFillAppInfo) {
Workarounds.fillAppInfo();
}
try (DesktopConnection connection = DesktopConnection.open(uid, tunnelForward, audio, control, sendDummyByte)) {
VideoCodec codec = options.getCodec();
if (options.getSendDeviceMeta()) {
Size videoSize = device.getScreenInfo().getVideoSize();
connection.sendDeviceMeta(Device.getDeviceName(), videoSize.getWidth(), videoSize.getHeight());
}
ScreenEncoder screenEncoder = new ScreenEncoder(options.getSendFrameMeta(), options.getBitRate(), options.getMaxFps(), codecOptions,
options.getEncoderName(), options.getDownsizeOnError());
Thread controllerThread = null;
Thread deviceMessageSenderThread = null;
Controller controller = null;
if (control) {
final Controller controller = new Controller(device, connection, options.getClipboardAutosync(), options.getPowerOn());
controller = new Controller(device, connection, options.getClipboardAutosync(), options.getPowerOn());
controller.start();
// asynchronous
controllerThread = startController(controller);
deviceMessageSenderThread = startDeviceMessageSender(controller.getSender());
device.setClipboardListener(text -> controller.getSender().pushClipboardText(text));
final Controller controllerRef = controller;
device.setClipboardListener(text -> controllerRef.getSender().pushClipboardText(text));
}
AudioEncoder audioEncoder = null;
if (audio) {
Streamer audioStreamer = new Streamer(connection.getAudioFd(), AudioCodec.OPUS, options.getSendCodecId(), options.getSendFrameMeta());
audioEncoder = new AudioEncoder(audioStreamer);
audioEncoder.start();
}
Streamer videoStreamer = new Streamer(connection.getVideoFd(), codec, options.getSendCodecId(), options.getSendFrameMeta());
ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getBitRate(), options.getMaxFps(),
codecOptions, options.getEncoderName(), options.getDownsizeOnError());
try {
// synchronous
screenEncoder.streamScreen(device, connection.getVideoFd());
screenEncoder.streamScreen();
} catch (IOException e) {
// this is expected on close
Ln.d("Screen streaming stopped");
} finally {
Ln.d("Screen streaming stopped");
initThread.interrupt();
if (controllerThread != null) {
controllerThread.interrupt();
if (audioEncoder != null) {
audioEncoder.stop();
}
if (deviceMessageSenderThread != null) {
deviceMessageSenderThread.interrupt();
if (controller != null) {
controller.stop();
}
try {
initThread.join();
if (audioEncoder != null) {
audioEncoder.join();
}
if (controller != null) {
controller.join();
}
} catch (InterruptedException e) {
// ignore
}
}
}
@@ -122,32 +155,6 @@ public final class Server {
return thread;
}
private static Thread startController(final Controller controller) {
Thread thread = new Thread(() -> {
try {
controller.control();
} catch (IOException e) {
// this is expected on close
Ln.d("Controller stopped");
}
});
thread.start();
return thread;
}
private static Thread startDeviceMessageSender(final DeviceMessageSender sender) {
Thread thread = new Thread(() -> {
try {
sender.loop();
} catch (IOException | InterruptedException e) {
// this is expected on close
Ln.d("Device message sender stopped");
}
});
thread.start();
return thread;
}
private static Options createOptions(String... args) {
if (args.length < 1) {
throw new IllegalArgumentException("Missing client version");
@@ -181,6 +188,17 @@ public final class Server {
Ln.Level level = Ln.Level.valueOf(value.toUpperCase(Locale.ENGLISH));
options.setLogLevel(level);
break;
case "audio":
boolean audio = Boolean.parseBoolean(value);
options.setAudio(audio);
break;
case "codec":
VideoCodec codec = VideoCodec.findByName(value);
if (codec == null) {
throw new IllegalArgumentException("Video codec " + value + " not supported");
}
options.setCodec(codec);
break;
case "max_size":
int maxSize = Integer.parseInt(value) & ~7; // multiple of 8
options.setMaxSize(maxSize);
@@ -262,12 +280,17 @@ public final class Server {
boolean sendDummyByte = Boolean.parseBoolean(value);
options.setSendDummyByte(sendDummyByte);
break;
case "send_codec_id":
boolean sendCodecId = Boolean.parseBoolean(value);
options.setSendCodecId(sendCodecId);
break;
case "raw_video_stream":
boolean rawVideoStream = Boolean.parseBoolean(value);
if (rawVideoStream) {
options.setSendDeviceMeta(false);
options.setSendFrameMeta(false);
options.setSendDummyByte(false);
options.setSendCodecId(false);
}
break;
default:

View File

@@ -0,0 +1,67 @@
package com.genymobile.scrcpy;
import android.media.MediaCodec;
import java.io.FileDescriptor;
import java.io.IOException;
import java.nio.ByteBuffer;
public final class Streamer {
private static final long PACKET_FLAG_CONFIG = 1L << 63;
private static final long PACKET_FLAG_KEY_FRAME = 1L << 62;
private final FileDescriptor fd;
private final Codec codec;
private final boolean sendCodecId;
private final boolean sendFrameMeta;
private final ByteBuffer headerBuffer = ByteBuffer.allocate(12);
public Streamer(FileDescriptor fd, Codec codec, boolean sendCodecId, boolean sendFrameMeta) {
this.fd = fd;
this.codec = codec;
this.sendCodecId = sendCodecId;
this.sendFrameMeta = sendFrameMeta;
}
public Codec getCodec() {
return codec;
}
public void writeHeader() throws IOException {
if (sendCodecId) {
ByteBuffer buffer = ByteBuffer.allocate(4);
buffer.putInt(codec.getId());
buffer.flip();
IO.writeFully(fd, buffer);
}
}
public void writePacket(ByteBuffer codecBuffer, MediaCodec.BufferInfo bufferInfo) throws IOException {
if (sendFrameMeta) {
writeFrameMeta(fd, bufferInfo, codecBuffer.remaining());
}
IO.writeFully(fd, codecBuffer);
}
private void writeFrameMeta(FileDescriptor fd, MediaCodec.BufferInfo bufferInfo, int packetSize) throws IOException {
headerBuffer.clear();
long ptsAndFlags;
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
ptsAndFlags = PACKET_FLAG_CONFIG; // non-media data packet
} else {
ptsAndFlags = bufferInfo.presentationTimeUs;
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_KEY_FRAME) != 0) {
ptsAndFlags |= PACKET_FLAG_KEY_FRAME;
}
}
headerBuffer.putLong(ptsAndFlags);
headerBuffer.putInt(packetSize);
headerBuffer.flip();
IO.writeFully(fd, headerBuffer);
}
}
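
For context, the 12-byte frame header written by writeFrameMeta() is read back on the client as an 8-byte PTS-and-flags field followed by a 4-byte packet size. A hedged sketch of the decoding side (the flag constants mirror the C demuxer; the helper name is illustrative):

```c
#include <stdbool.h>
#include <stdint.h>

#define SC_PACKET_FLAG_CONFIG    (UINT64_C(1) << 63)
#define SC_PACKET_FLAG_KEY_FRAME (UINT64_C(1) << 62)
#define SC_PACKET_PTS_MASK       (SC_PACKET_FLAG_KEY_FRAME - 1)

// The two high bits of the 64-bit field carry the config and key-frame
// flags; the remaining 62 bits carry the PTS in microseconds.
static void
unpack_pts_and_flags(uint64_t value, bool *config, bool *key_frame,
                     uint64_t *pts) {
    *config = value & SC_PACKET_FLAG_CONFIG;
    *key_frame = value & SC_PACKET_FLAG_KEY_FRAME;
    *pts = value & SC_PACKET_PTS_MASK;
}
```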

View File

@@ -0,0 +1,48 @@
package com.genymobile.scrcpy;
import android.media.MediaFormat;
public enum VideoCodec implements Codec {
H264(0x68_32_36_34, "h264", MediaFormat.MIMETYPE_VIDEO_AVC),
H265(0x68_32_36_35, "h265", MediaFormat.MIMETYPE_VIDEO_HEVC),
AV1(0x00_61_76_31, "av1", MediaFormat.MIMETYPE_VIDEO_AV1);
private final int id; // 4-byte ASCII representation of the name
private final String name;
private final String mimeType;
VideoCodec(int id, String name, String mimeType) {
this.id = id;
this.name = name;
this.mimeType = mimeType;
}
@Override
public Type getType() {
return Type.VIDEO;
}
@Override
public int getId() {
return id;
}
@Override
public String getName() {
return name;
}
@Override
public String getMimeType() {
return mimeType;
}
public static VideoCodec findByName(String name) {
for (VideoCodec codec : values()) {
if (codec.name.equals(name)) {
return codec;
}
}
return null;
}
}

View File

@@ -3,6 +3,7 @@ package com.genymobile.scrcpy;
import android.annotation.SuppressLint;
import android.app.Application;
import android.app.Instrumentation;
import android.content.ContextWrapper;
import android.content.pm.ApplicationInfo;
import android.os.Looper;
@@ -60,7 +61,10 @@ public final class Workarounds {
mBoundApplicationField.setAccessible(true);
mBoundApplicationField.set(activityThread, appBindData);
Application app = Instrumentation.newApplication(Application.class, FakeContext.get());
Application app = Application.class.newInstance();
Field baseField = ContextWrapper.class.getDeclaredField("mBase");
baseField.setAccessible(true);
baseField.set(app, FakeContext.get());
// activityThread.mInitialApplication = app;
Field mInitialApplicationField = activityThreadClass.getDeclaredField("mInitialApplication");

View File

@@ -30,6 +30,8 @@ public final class SurfaceControl {
private static Method getBuiltInDisplayMethod;
private static Method setDisplayPowerModeMethod;
private static Method getPhysicalDisplayTokenMethod;
private static Method getPhysicalDisplayIdsMethod;
private SurfaceControl() {
// only static methods
@@ -98,7 +100,6 @@ public final class SurfaceControl {
}
public static IBinder getBuiltInDisplay() {
try {
Method method = getGetBuiltInDisplayMethod();
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) {
@@ -114,6 +115,40 @@ public final class SurfaceControl {
}
}
private static Method getGetPhysicalDisplayTokenMethod() throws NoSuchMethodException {
if (getPhysicalDisplayTokenMethod == null) {
getPhysicalDisplayTokenMethod = CLASS.getMethod("getPhysicalDisplayToken", long.class);
}
return getPhysicalDisplayTokenMethod;
}
public static IBinder getPhysicalDisplayToken(long physicalDisplayId) {
try {
Method method = getGetPhysicalDisplayTokenMethod();
return (IBinder) method.invoke(null, physicalDisplayId);
} catch (InvocationTargetException | IllegalAccessException | NoSuchMethodException e) {
Ln.e("Could not invoke method", e);
return null;
}
}
private static Method getGetPhysicalDisplayIdsMethod() throws NoSuchMethodException {
if (getPhysicalDisplayIdsMethod == null) {
getPhysicalDisplayIdsMethod = CLASS.getMethod("getPhysicalDisplayIds");
}
return getPhysicalDisplayIdsMethod;
}
public static long[] getPhysicalDisplayIds() {
try {
Method method = getGetPhysicalDisplayIdsMethod();
return (long[]) method.invoke(null);
} catch (InvocationTargetException | IllegalAccessException | NoSuchMethodException e) {
Ln.e("Could not invoke method", e);
return null;
}
}
private static Method getSetDisplayPowerModeMethod() throws NoSuchMethodException {
if (setDisplayPowerModeMethod == null) {
setDisplayPowerModeMethod = CLASS.getMethod("setDisplayPowerMode", IBinder.class, int.class);