Compare commits

37 Commits

Author SHA1 Message Date
Romain Vimont
8c154d53d6 Add support for AAC audio codec
Add option --audio-codec=aac.
2023-02-18 19:30:59 +01:00
Romain Vimont
105b482082 Add option to select audio codec
Introduce the selection mechanism. Alternative codecs will be added
later.
2023-02-18 19:30:14 +01:00
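For context, the cli.c hunk further down shows how the video codec argument is parsed (a strcmp chain over "h264", "h265" and "av1"). An audio variant would plausibly follow the same shape; the sketch below is illustrative, with hypothetical enum names, and is not scrcpy's actual code:

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

// Hypothetical enum; scrcpy's own constant names are not shown in this diff.
enum audio_codec {
    AUDIO_CODEC_OPUS,
    AUDIO_CODEC_AAC,
};

// Map the --audio-codec argument to an enum, mirroring the strcmp chain used
// by the video codec parser in cli.c.
static bool
parse_audio_codec(const char *optarg, enum audio_codec *codec) {
    if (!strcmp(optarg, "opus")) {
        *codec = AUDIO_CODEC_OPUS;
        return true;
    }
    if (!strcmp(optarg, "aac")) {
        *codec = AUDIO_CODEC_AAC;
        return true;
    }
    fprintf(stderr, "Unsupported audio codec: %s (expected opus or aac)\n",
            optarg);
    return false;
}
```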
Romain Vimont
4dbb7aa685 Add --audio-bit-rate option
Add an option to configure the audio bit-rate.
2023-02-18 19:10:29 +01:00
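The man page hunk below documents that bit-rate values accept 'K' (x1000) and 'M' (x1000000) suffixes. Here is a self-contained sketch of such a parser; scrcpy's own parse_bit_rate implementation is not shown in this diff, so this is only an assumption about its behavior:

```c
#include <errno.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdlib.h>

// Parse values like "8M", "2M" or "128K" into bits per second.
// Example: parse_bit_rate_value("8M", &rate) sets rate to 8000000.
static bool
parse_bit_rate_value(const char *s, uint32_t *out) {
    errno = 0;
    char *end;
    long long value = strtoll(s, &end, 10);
    if (errno || end == s || value < 0) {
        return false;
    }
    long long multiplier = 1;
    if (*end == 'K' || *end == 'k') {
        multiplier = 1000;
        ++end;
    } else if (*end == 'M' || *end == 'm') {
        multiplier = 1000000;
        ++end;
    }
    if (*end != '\0') {
        return false; // trailing garbage
    }
    long long result = value * multiplier;
    if (result > UINT32_MAX) {
        return false;
    }
    *out = (uint32_t) result;
    return true;
}
```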
Romain Vimont
637683ed7a Remove default bit-rate on client side
If no bit-rate is passed, let the server use the default value (8Mbps).

This avoids defining a default value on both sides, and passing the
default bit-rate as an argument when starting the server.
2023-02-18 19:10:29 +01:00
Romain Vimont
166257b808 Disable audio if device could not capture
By default, audio is enabled (--no-audio must be explicitly passed to
disable it).

However, some devices may not support audio capture (typically devices
below Android 11, or Android 11 when the shell application is not in the
foreground on start).

In that case, make the server notify the client to dynamically disable
audio forwarding so that it does not wait indefinitely for an audio
stream.
2023-02-18 19:10:28 +01:00
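The demuxer.c hunk further down shows the client side of this notification: the device sends a codec id at the start of each stream, and a value of 0 means the stream was explicitly disabled. A simplified sketch of reading that handshake with plain read() (scrcpy uses its own net and interruptor wrappers; big-endian byte order is assumed here):

```c
#include <stdbool.h>
#include <stdint.h>
#include <unistd.h>

// Read the 4-byte codec id sent by the device at the start of a stream.
// A value of 0 means the stream was explicitly disabled (e.g. audio capture
// is not supported), so the client must not wait for packets on this socket.
static bool
read_codec_id(int fd, uint32_t *codec_id) {
    uint8_t buf[4];
    size_t received = 0;
    while (received < sizeof(buf)) {
        ssize_t n = read(fd, buf + received, sizeof(buf) - received);
        if (n <= 0) {
            return false; // connection error
        }
        received += (size_t) n;
    }
    *codec_id = ((uint32_t) buf[0] << 24) | ((uint32_t) buf[1] << 16)
              | ((uint32_t) buf[2] << 8) | (uint32_t) buf[3];
    return true;
}
```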
Romain Vimont
3038bf4c3b Add record audio support
Make the recorder accept two input sources (video and audio), and mux
them into a single file.
2023-02-18 19:10:28 +01:00
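The recorder is built on libavformat (see the recorder.c hunks below). A minimal sketch of muxing a video and an audio stream into one output file with that API; this is illustrative, not the recorder's actual code:

```c
#include <errno.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

// Create an output context with one video and one audio stream, then write
// the container header. Packets would then be written with
// av_interleaved_write_frame(), with pkt->stream_index set to the index of
// the stream the packet belongs to, and av_write_trailer() at the end.
static int
mux_two_streams(const char *filename,
                const AVCodecParameters *video_par,
                const AVCodecParameters *audio_par) {
    AVFormatContext *ctx = NULL;
    int ret = avformat_alloc_output_context2(&ctx, NULL, NULL, filename);
    if (ret < 0) {
        return ret;
    }

    AVStream *video = avformat_new_stream(ctx, NULL);
    AVStream *audio = avformat_new_stream(ctx, NULL);
    if (!video || !audio) {
        avformat_free_context(ctx);
        return AVERROR(ENOMEM);
    }
    avcodec_parameters_copy(video->codecpar, video_par);
    avcodec_parameters_copy(audio->codecpar, audio_par);

    ret = avio_open(&ctx->pb, filename, AVIO_FLAG_WRITE);
    if (ret < 0) {
        avformat_free_context(ctx);
        return ret;
    }
    return avformat_write_header(ctx, NULL);
}
```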
Romain Vimont
b7af74b01a Rename video-specific variables in recorder
This paves the way to add audio-specific variables.
2023-02-18 19:10:28 +01:00
Romain Vimont
3f3e60dd7a Do not merge config audio packets
For video streams (at least H.264 and H.265), the config packet
containing SPS/PPS must be prepended to the next packet (the following
keyframe).

For audio streams (at least OPUS), config packets must not be merged.
2023-02-18 19:10:28 +01:00
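In practice, merging means copying the saved config data (SPS/PPS) in front of the payload of the following packet. A sketch of that operation on an AVPacket, similar in spirit to what the client does for video streams (not the exact scrcpy code):

```c
#include <stdint.h>
#include <string.h>
#include <libavcodec/avcodec.h>

// Prepend previously saved config data (e.g. H.264 SPS/PPS) to a packet so
// that the decoder/muxer receives them together. For OPUS audio, this step
// is skipped and the config packet is passed through unchanged.
static int
prepend_config(AVPacket *packet, const uint8_t *config, size_t config_size) {
    size_t media_size = (size_t) packet->size;
    int ret = av_grow_packet(packet, (int) config_size);
    if (ret < 0) {
        return ret;
    }
    // Shift the media data to make room for the config data at the front
    memmove(packet->data + config_size, packet->data, media_size);
    memcpy(packet->data, config, config_size);
    return 0;
}
```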
Romain Vimont
d722b9de4d Add an audio demuxer
Add a demuxer which will read the stream from the audio socket.
2023-02-18 19:10:28 +01:00
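Each packet on a stream socket is prefixed by a small header carrying the PTS and some flags (this layout is not visible in this diff, so treat it as an assumption): the most significant bits of the 64-bit PTS field mark config packets and key frames. A sketch of turning that header value into AVPacket metadata:

```c
#include <stdint.h>
#include <libavcodec/avcodec.h>

// Assumed flag layout: bit 63 marks a config packet, bit 62 a key frame,
// and the remaining bits carry the PTS in microseconds.
#define PACKET_FLAG_CONFIG    (UINT64_C(1) << 63)
#define PACKET_FLAG_KEY_FRAME (UINT64_C(1) << 62)
#define PACKET_PTS_MASK       (PACKET_FLAG_KEY_FRAME - 1)

// Fill in AVPacket metadata from the received header value: a config packet
// has no PTS (AV_NOPTS_VALUE), and a key frame sets AV_PKT_FLAG_KEY.
static void
apply_packet_header(AVPacket *packet, uint64_t pts_and_flags) {
    if (pts_and_flags & PACKET_FLAG_CONFIG) {
        packet->pts = AV_NOPTS_VALUE;
    } else {
        packet->pts = (int64_t) (pts_and_flags & PACKET_PTS_MASK);
    }
    if (pts_and_flags & PACKET_FLAG_KEY_FRAME) {
        packet->flags |= AV_PKT_FLAG_KEY;
    }
    packet->dts = packet->pts;
}
```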
Romain Vimont
6a3d311d47 Rename demuxer to video_demuxer
There will be another demuxer instance for audio.
2023-02-18 19:10:28 +01:00
Romain Vimont
a1b97fbfaa Extract OPUS extradata
For the OPUS codec, FFmpeg expects the raw extradata, but MediaCodec
wraps it in some structure.

Fix the config packet to send only the raw extradata.
2023-02-18 19:10:28 +01:00
Romain Vimont
5d5a1d6888 Use a streamer to send the audio stream
Send each encoded audio packet using a streamer.
2023-02-18 19:10:28 +01:00
Romain Vimont
c19fefbcbd Encode recorded audio in OPUS on the device
For now, the encoded packets are just logged in the console.
2023-02-18 19:10:28 +01:00
Simon Chan
bc0f51023e Capture device audio
Create an AudioRecorder to capture the audio source REMOTE_SUBMIX.

For now, the captured packets are just logged in the console.

Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-18 19:10:28 +01:00
Simon Chan
3a8a7d5ba8 Add a new socket for audio stream
When audio is enabled, open a new socket to send the audio stream from
the device to the client.

Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-18 19:10:27 +01:00
Simon Chan
1fe2afb4df Add --no-audio option
Audio will be enabled by default (when supported). Add an option to
disable it.

Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-18 19:10:27 +01:00
Romain Vimont
47d0c189ec Use FakeContext for Application instance
This will expose the correct package name and UID to the application
context.
2023-02-18 19:10:27 +01:00
Romain Vimont
c76264c149 Use shell package name for workarounds
For consistency.
2023-02-18 19:10:27 +01:00
Romain Vimont
be09e65fa4 Use PACKAGE_NAME from FakeContext
Remove duplicated constant.
2023-02-18 19:10:27 +01:00
Romain Vimont
de8a9a4045 Use AttributionSource from FakeContext
FakeContext already provides an AttributionSource instance.

Co-authored-by: Simon Chan <1330321+yume-chan@users.noreply.github.com>
2023-02-18 19:10:27 +01:00
Simon Chan
4e47d78f30 Add a fake Android Context
Since scrcpy-server is not an Android application (it's a Java
executable), it has no Context.

Some features will require a Context instance to get the package name
and the UID. Add a FakeContext for this purpose.

Co-authored-by: Romain Vimont <rom@rom1v.com>
Signed-off-by: Romain Vimont <rom@rom1v.com>
2023-02-18 19:10:27 +01:00
Romain Vimont
0873b08c85 Use Process.ROOT_UID
Replace ServiceManager.USER_ID by existing constant Process.ROOT_UID.
2023-02-18 19:10:27 +01:00
Romain Vimont
83c64d918b Make streamer independent of codec type
Rename VideoStreamer to Streamer, and extract a Codec interface which
will also support audio codecs.
2023-02-18 19:10:26 +01:00
Romain Vimont
aedd90a955 Pass all args to ScreenEncoder constructor
There is no good reason to pass some of them in the constructor and some
others as parameters of the streamScreen() method.
2023-02-18 19:10:26 +01:00
Romain Vimont
e153f4657c Move screen encoder initialization
This prepares further refactors.
2023-02-18 19:10:26 +01:00
Romain Vimont
dd2959b0cb Write streamer header from ScreenEncoder
The screen encoder is responsible for writing data to the video streamer.
2023-02-18 19:10:26 +01:00
Romain Vimont
6118c2b4cd Use VideoStreamer directly from ScreenEncoder
The Callbacks interface notifies of new packets. But in addition, the
screen encoder will need to write headers on start.

We could add an onStart() function, but for simplicity, just remove the
interface, which brings no value, and call the streamer directly.

Refs 87972e2022
2023-02-18 19:10:26 +01:00
Romain Vimont
fad8c68090 Simplify error handling on socket creation
On any error, all previously opened sockets must be closed.

Handle these errors in a single catch-block. Currently, there are only 2
sockets, but this will simplify things even more once there are more sockets.

Note: this commit is better displayed with --ignore-space-change (-b).
2023-02-18 19:10:26 +01:00
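The commit above concerns the Java server, where a single catch block closes whatever was already opened. The same idea in C is a single error path reached by goto, as in this generic sketch (unrelated to scrcpy's actual socket code):

```c
#include <stdbool.h>
#include <sys/socket.h>
#include <unistd.h>

// Open several sockets; on any failure, close the ones already opened from a
// single cleanup path, so each new socket only adds one goto target.
static bool
open_all(int *video_fd, int *audio_fd, int *control_fd) {
    *video_fd = socket(AF_INET, SOCK_STREAM, 0);
    if (*video_fd == -1) {
        return false;
    }
    *audio_fd = socket(AF_INET, SOCK_STREAM, 0);
    if (*audio_fd == -1) {
        goto error_close_video;
    }
    *control_fd = socket(AF_INET, SOCK_STREAM, 0);
    if (*control_fd == -1) {
        goto error_close_audio;
    }
    return true;

error_close_audio:
    close(*audio_fd);
error_close_video:
    close(*video_fd);
    return false;
}
```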
Romain Vimont
9b5b75d800 Refactor recorder logic
Process the initial config packet (necessary to write the header)
separately.
2023-02-18 19:10:26 +01:00
Romain Vimont
1270617422 Move last packet recording
Write the last packet at the end.
2023-02-18 19:10:26 +01:00
Romain Vimont
0cbbf82917 Open recording file from the recorder thread
The recorder opened the target file from the packet sink open()
callback, called by the demuxer. Only then was the recorder thread
started.

One golden rule for the recorder is to never block the demuxer for I/O,
because it would impact mirroring. This rule is respected when recording
packets, but not for the initial recorder opening.

Therefore, start the recorder thread from sc_recorder_init(), open the
file immediately from the recorder thread, then make it wait for the
stream to start (on packet sink open()).

Now that the recorder can report errors directly (rather than making the
demuxer call fail), it is possible to report a file opening error even
before the packet sink is open.
2023-02-18 19:10:26 +01:00
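A sketch of the startup order described above, using plain pthreads (scrcpy uses its own sc_thread/sc_mutex/sc_cond wrappers, so names and details here are illustrative):

```c
#include <pthread.h>
#include <stdbool.h>
#include <stdio.h>

// Assumed minimal state; mutex and cond are initialized in init(), which
// also starts the thread (see the related commits above).
struct recorder {
    pthread_mutex_t mutex;
    pthread_cond_t cond;
    bool stream_started;
    const char *filename;
    FILE *file;
};

static void *
run_recorder(void *data) {
    struct recorder *rec = data;

    // Open the file from the recorder thread, so that slow I/O never blocks
    // the demuxer; an error can be reported immediately via a callback.
    rec->file = fopen(rec->filename, "wb");
    if (!rec->file) {
        return NULL;
    }

    // Wait for the stream to start (signaled from the packet sink open()).
    pthread_mutex_lock(&rec->mutex);
    while (!rec->stream_started) {
        pthread_cond_wait(&rec->cond, &rec->mutex);
    }
    pthread_mutex_unlock(&rec->mutex);

    // ... process queued packets, then ...
    fclose(rec->file);
    return NULL;
}

// Called from the packet sink open() callback, on the demuxer thread.
static void
recorder_on_stream_start(struct recorder *rec) {
    pthread_mutex_lock(&rec->mutex);
    rec->stream_started = true;
    pthread_cond_signal(&rec->cond);
    pthread_mutex_unlock(&rec->mutex);
}
```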
Romain Vimont
25b3ad844c Inline packet_sink impl in recorder
Remove useless wrappers.
2023-02-18 19:10:26 +01:00
Romain Vimont
689314578e Initialize recorder fields from init()
The recorder has two initialization phases: one to initialize the
concrete recorder object, and one to open its packet_sink trait.

Initialize mutex and condvar as part of the object initialization.

If there were several packet_sink traits, the mutex and condvar would
still be initialized only once.
2023-02-18 19:10:26 +01:00
Romain Vimont
af61c7340d Report recorder errors
Stop scrcpy on recorder errors.

It was previously stopped indirectly by the demuxer, which failed to
push packets to a recorder in error. Report errors directly instead:
 - it avoids waiting for the next demuxer call;
 - it will allow opening the target file from a separate thread and
   stopping immediately on any I/O error.
2023-02-18 19:10:25 +01:00
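The mechanism is a callback trait: the main loop registers an on_ended() callback, and the recorder invokes it on completion or error. The signature below matches the one referenced in the recorder.c and scrcpy.c hunks further down:

```c
#include <stdbool.h>

struct sc_recorder; // opaque here

// Invoked from the recorder thread when recording ends, successfully or not,
// so that scrcpy can stop immediately instead of waiting for the demuxer to
// fail on its next push.
struct sc_recorder_callbacks {
    void (*on_ended)(struct sc_recorder *recorder, bool success,
                     void *userdata);
};
```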
Romain Vimont
f0c991537f Move previous packet to a local variable
It is only used from run_recorder().
2023-02-18 19:10:25 +01:00
Romain Vimont
6b542c8ceb Move pts_origin to a local variable
It is only used from run_recorder().
2023-02-18 19:10:25 +01:00
Romain Vimont
01b647b910 Change PTS origin type from uint64_t to int64_t
It is initialized from AVPacket.pts, which is an int64_t.
2023-02-18 19:10:25 +01:00
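The recorder uses pts_origin to rebase recorded timestamps so the file starts at zero; since AVPacket.pts is an int64_t, the origin must be as well (a uint64_t would make the subtraction and AV_NOPTS_VALUE comparisons fragile). A tiny illustrative sketch, not the exact recorder code:

```c
#include <libavcodec/avcodec.h>

// Rebase a packet's timestamps against the first observed PTS.
// pts_origin is expected to start at AV_NOPTS_VALUE.
static void
rebase_packet(AVPacket *packet, int64_t *pts_origin) {
    if (*pts_origin == AV_NOPTS_VALUE) {
        *pts_origin = packet->pts; // int64_t, like AVPacket.pts
    }
    packet->pts -= *pts_origin;
    packet->dts = packet->pts;
}
```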
57 changed files with 586 additions and 1930 deletions

View File: README.md

@@ -199,7 +199,7 @@ preserved. That way, a device in 1920×1080 will be mirrored at 1024×576.
The default bit-rate is 8 Mbps. To change the video bitrate (e.g. to 2 Mbps):
```bash
scrcpy --video-bit-rate=2M
scrcpy --bit-rate=2M
scrcpy -b 2M # short version
```
@@ -258,9 +258,9 @@ The video codec can be selected. The possible values are `h264` (default),
`h265` and `av1`:
```bash
scrcpy --video-codec=h264 # default
scrcpy --video-codec=h265
scrcpy --video-codec=av1
scrcpy --codec=h264 # default
scrcpy --codec=h265
scrcpy --codec=av1
```
@@ -270,13 +270,15 @@ Some devices have more than one encoder for a specific codec, and some of them
may cause issues or crash. It is possible to select a different encoder:
```bash
scrcpy --video-encoder=OMX.qcom.video.encoder.avc
scrcpy --encoder=OMX.qcom.video.encoder.avc
```
To list the available encoders:
To list the available encoders, you can pass an invalid encoder name; the
error will give the available encoders:
```bash
scrcpy --list-encoders
scrcpy --encoder=_ # for the default codec
scrcpy --codec=h265 --encoder=_ # for a specific codec
```
### Capture
@@ -442,7 +444,7 @@ none found, try running `adb disconnect`, and then run those two commands again.
It may be useful to decrease the bit-rate and the resolution:
```bash
scrcpy --video-bit-rate=2M --max-size=800
scrcpy --bit-rate=2M --max-size=800
scrcpy -b2M -m800 # short version
```

View File: app/data/bash-completion/scrcpy

@@ -2,33 +2,28 @@ _scrcpy() {
local cur prev words cword
local opts="
--always-on-top
--audio-bit-rate=
--audio-codec=
--audio-codec-options=
--audio-encoder=
-b --video-bit-rate=
-b --bit-rate=
--codec-options=
--crop=
-d --select-usb
--disable-screensaver
--display=
--display-buffer=
-e --select-tcpip
--encoder=
--force-adb-forward
--forward-all-clicks
-f --fullscreen
-K --hid-keyboard
-h --help
--legacy-paste
--list-displays
--list-encoders
--lock-video-orientation
--lock-video-orientation=
--max-fps=
-M --hid-mouse
-m --max-size=
--no-audio
--no-cleanup
--no-clipboard-autosync
--no-clipboard-on-error
--no-downsize-on-error
-n --no-control
-N --no-display
@@ -58,9 +53,6 @@ _scrcpy() {
--v4l2-sink=
-V --verbosity=
-v --version
--video-codec=
--video-codec-options=
--video-encoder=
-w --stay-awake
--window-borderless
--window-title=
@@ -72,14 +64,6 @@ _scrcpy() {
_init_completion -s || return
case "$prev" in
--video-codec)
COMPREPLY=($(compgen -W 'h264 h265 av1' -- "$cur"))
return
;;
--audio-codec)
COMPREPLY=($(compgen -W 'opus aac' -- "$cur"))
return
;;
--lock-video-orientation)
COMPREPLY=($(compgen -W 'unlocked initial 0 1 2 3' -- "$cur"))
return
@@ -114,7 +98,7 @@ _scrcpy() {
COMPREPLY=($(compgen -W "$("${ADB:-adb}" devices | awk '$2 == "device" {print $1}')" -- ${cur}))
return
;;
-b|--video-bit-rate \
-b|--bitrate \
|--codec-options \
|--crop \
|--display \

View File: app/data/zsh-completion/_scrcpy

@@ -9,30 +9,25 @@ local arguments
arguments=(
'--always-on-top[Make scrcpy window always on top \(above other windows\)]'
'--audio-bit-rate=[Encode the audio at the given bit-rate]'
'--audio-codec=[Select the audio codec]:codec:(opus aac)'
'--audio-codec-options=[Set a list of comma-separated key\:type=value options for the device audio encoder]'
'--audio-encoder=[Use a specific MediaCodec audio encoder]'
{-b,--video-bit-rate=}'[Encode the video at the given bit-rate]'
{-b,--bit-rate=}'[Encode the video at the given bit-rate]'
'--codec-options=[Set a list of comma-separated key\:type=value options for the device encoder]'
'--crop=[\[width\:height\:x\:y\] Crop the device screen on the server]'
{-d,--select-usb}'[Use USB device]'
'--disable-screensaver[Disable screensaver while scrcpy is running]'
'--display=[Specify the display id to mirror]'
'--display-buffer=[Add a buffering delay \(in milliseconds\) before displaying]'
{-e,--select-tcpip}'[Use TCP/IP device]'
'--encoder=[Use a specific MediaCodec encoder \(must be a H.264 encoder\)]'
'--force-adb-forward[Do not attempt to use \"adb reverse\" to connect to the device]'
'--forward-all-clicks[Forward clicks to device]'
{-f,--fullscreen}'[Start in fullscreen]'
{-K,--hid-keyboard}'[Simulate a physical keyboard by using HID over AOAv2]'
{-h,--help}'[Print the help]'
'--legacy-paste[Inject computer clipboard text as a sequence of key events on Ctrl+v]'
'--list-displays[List displays available on the device]'
'--list-encoders[List video and audio encoders available on the device]'
'--lock-video-orientation=[Lock video orientation]:orientation:(unlocked initial 0 1 2 3)'
'--max-fps=[Limit the frame rate of screen capture]'
{-M,--hid-mouse}'[Simulate a physical mouse by using HID over AOAv2]'
{-m,--max-size=}'[Limit both the width and height of the video to value]'
'--no-audio[Disable audio forwarding]'
'--no-cleanup[Disable device cleanup actions on exit]'
'--no-clipboard-autosync[Disable automatic clipboard synchronization]'
'--no-downsize-on-error[Disable lowering definition on MediaCodec error]'
@@ -63,9 +58,6 @@ arguments=(
'--v4l2-sink=[\[\/dev\/videoN\] Output to v4l2loopback device]'
{-V,--verbosity=}'[Set the log level]:verbosity:(verbose debug info warn error)'
{-v,--version}'[Print the version of scrcpy]'
'--video-codec=[Select the video codec]:codec:(h264 h265 av1)'
'--video-codec-options=[Set a list of comma-separated key\:type=value options for the device video encoder]'
'--video-encoder=[Use a specific MediaCodec video encoder]'
{-w,--stay-awake}'[Keep the device on while scrcpy is running, when the device is plugged in]'
'--window-borderless[Disable window decorations \(display borderless window\)]'
'--window-title=[Set a custom window title]'

View File: app/meson.build

@@ -4,7 +4,6 @@ src = [
'src/adb/adb_device.c',
'src/adb/adb_parser.c',
'src/adb/adb_tunnel.c',
'src/audio_player.c',
'src/cli.c',
'src/clock.c',
'src/compat.c',
@@ -31,8 +30,6 @@ src = [
'src/version.c',
'src/video_buffer.c',
'src/util/acksync.c',
'src/util/average.c',
'src/util/bytebuf.c',
'src/util/file.c',
'src/util/intmap.c',
'src/util/intr.c',
@@ -102,7 +99,6 @@ if not crossbuild_windows
dependency('libavformat', version: '>= 57.33'),
dependency('libavcodec', version: '>= 57.37'),
dependency('libavutil'),
dependency('libswresample'),
dependency('sdl2', version: '>= 2.0.5'),
]
@@ -137,14 +133,12 @@ else
ffmpeg_avcodec = meson.get_cross_property('ffmpeg_avcodec')
ffmpeg_avformat = meson.get_cross_property('ffmpeg_avformat')
ffmpeg_avutil = meson.get_cross_property('ffmpeg_avutil')
ffmpeg_swresample = meson.get_cross_property('ffmpeg_swresample')
ffmpeg = declare_dependency(
dependencies: [
cc.find_library(ffmpeg_avcodec, dirs: ffmpeg_bin_dir),
cc.find_library(ffmpeg_avformat, dirs: ffmpeg_bin_dir),
cc.find_library(ffmpeg_avutil, dirs: ffmpeg_bin_dir),
cc.find_library(ffmpeg_swresample, dirs: ffmpeg_bin_dir),
],
include_directories: include_directories(ffmpeg_include_dir)
)
@@ -266,10 +260,6 @@ if get_option('buildtype') == 'debug'
['test_binary', [
'tests/test_binary.c',
]],
['test_bytebuf', [
'tests/test_bytebuf.c',
'src/util/bytebuf.c',
]],
['test_cbuf', [
'tests/test_cbuf.c',
]],

View File: app/scrcpy.1

@@ -32,17 +32,27 @@ Select an audio codec (opus or aac).
Default is opus.
.TP
.BI "\-\-audio\-encoder " name
Use a specific MediaCodec audio encoder (depending on the codec provided by \fB\-\-audio\-codec\fR).
The available encoders can be listed by \-\-list\-encoders.
.TP
.BI "\-b, \-\-video\-bit\-rate " value
.BI "\-b, \-\-bit\-rate " value
Encode the video at the given bit\-rate, expressed in bits/s. Unit suffixes are supported: '\fBK\fR' (x1000) and '\fBM\fR' (x1000000).
Default is 8M (8000000).
.TP
.BI "\-\-codec " name
Select a video codec (h264, h265 or av1).
Default is h264.
.TP
.BI "\-\-codec\-options " key\fR[:\fItype\fR]=\fIvalue\fR[,...]
Set a list of comma-separated key:type=value options for the device encoder.
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
The list of possible codec options is available in the Android documentation
.UR https://d.android.com/reference/android/media/MediaFormat
.UE .
.TP
.BI "\-\-crop " width\fR:\fIheight\fR:\fIx\fR:\fIy
Crop the device screen on the server.
@@ -63,9 +73,10 @@ Disable screensaver while scrcpy is running.
.TP
.BI "\-\-display " id
Specify the device display id to mirror.
Specify the display id to mirror.
The available display ids can be listed by \-\-list\-displays.
The list of possible display ids can be listed by "adb shell dumpsys display"
(search "mDisplayId=" in the output).
Default is 0.
@@ -81,6 +92,10 @@ Use TCP/IP device (if there is exactly one, like adb -e).
Also see \fB\-d\fR (\fB\-\-select\-usb\fR).
.TP
.BI "\-\-encoder " name
Use a specific MediaCodec encoder (must be a H.264 encoder).
.TP
.B \-\-force\-adb\-forward
Do not attempt to use "adb reverse" to connect to the device.
@@ -119,14 +134,6 @@ Inject computer clipboard text as a sequence of key events on Ctrl+v (like MOD+S
This is a workaround for some devices not behaving as expected when setting the device clipboard programmatically.
.TP
.B \-\-list\-encoders
List video and audio encoders available on the device.
.TP
.B \-\-list\-displays
List displays available on the device.
.TP
\fB\-\-lock\-video\-orientation\fR[=\fIvalue\fR]
Lock video orientation to \fIvalue\fR. Possible values are "unlocked", "initial" (locked to the initial orientation), 0, 1, 2 and 3. Natural device orientation is 0, and each increment adds a 90 degrees rotation counterclockwise.
@@ -334,28 +341,6 @@ Default is "info" for release builds, "debug" for debug builds.
.B \-v, \-\-version
Print the version of scrcpy.
.TP
.BI "\-\-video\-codec " name
Select a video codec (h264, h265 or av1).
Default is h264.
.TP
.BI "\-\-video\-codec\-options " key\fR[:\fItype\fR]=\fIvalue\fR[,...]
Set a list of comma-separated key:type=value options for the device video encoder.
The possible values for 'type' are 'int' (default), 'long', 'float' and 'string'.
The list of possible codec options is available in the Android documentation
.UR https://d.android.com/reference/android/media/MediaFormat
.UE .
.TP
.BI "\-\-video\-encoder " name
Use a specific MediaCodec video encoder (depending on the codec provided by \fB\-\-video\-codec\fR).
The available encoders can be listed by \-\-list\-encoders.
.TP
.B \-w, \-\-stay-awake
Keep the device on while scrcpy is running, when the device is plugged in.

View File: app/src/audio_player.c

@@ -1,304 +0,0 @@
#include "audio_player.h"
#include <libavutil/opt.h>
#include "util/log.h"
#define SC_AUDIO_PLAYER_NDEBUG // comment to debug
/** Downcast frame_sink to sc_audio_player */
#define DOWNCAST(SINK) container_of(SINK, struct sc_audio_player, frame_sink)
#define SC_AV_SAMPLE_FMT AV_SAMPLE_FMT_FLT
#define SC_SDL_SAMPLE_FMT AUDIO_F32
#define SC_AUDIO_OUTPUT_BUFFER_SAMPLES 480 // 10ms at 48000Hz
// The target number of buffered samples between the producer and the consumer.
// This value is directly used for compensation.
#define SC_TARGET_BUFFERED_SAMPLES (3 * SC_AUDIO_OUTPUT_BUFFER_SAMPLES)
// If the consumer is too late, skip samples to keep at most this value
#define SC_BUFFERED_SAMPLES_THRESHOLD 2400 // 50ms at 48000Hz
// Use a ring-buffer of 1 second (at 48000Hz) between the producer and the
// consumer. It is too big, but it guarantees that the producer and the consumer
// will be able to access it in parallel without locking.
#define SC_BYTEBUF_SIZE_IN_SAMPLES 48000
void
sc_audio_player_sdl_callback(void *userdata, uint8_t *stream, int len_int) {
struct sc_audio_player *ap = userdata;
// This callback is called with the lock used by SDL_AudioDeviceLock(), so
// the bytebuf is protected
assert(len_int > 0);
size_t len = len_int;
#ifndef SC_AUDIO_PLAYER_NDEBUG
LOGD("[Audio] SDL callback requests %" SC_PRIsizet " samples",
len / (ap->nb_channels * ap->out_bytes_per_sample));
#endif
size_t read = sc_bytebuf_read_remaining(&ap->buf);
size_t max_buffered_bytes = SC_BUFFERED_SAMPLES_THRESHOLD
* ap->nb_channels * ap->out_bytes_per_sample;
if (read > max_buffered_bytes + len) {
size_t skip = read - (max_buffered_bytes + len);
#ifndef SC_AUDIO_PLAYER_NDEBUG
LOGD("[Audio] Buffered samples threshold exceeded: %" SC_PRIsizet
" bytes, skipping %" SC_PRIsizet " bytes", read, skip);
#endif
// After this callback, exactly max_buffered_bytes will remain
sc_bytebuf_skip(&ap->buf, skip);
read = max_buffered_bytes + len;
}
// Number of buffered samples (may be negative on underflow)
float buffered_samples = ((float) read - len_int)
/ (ap->nb_channels * ap->out_bytes_per_sample);
sc_average_push(&ap->avg_buffered_samples, buffered_samples);
if (read) {
if (read > len) {
read = len;
}
sc_bytebuf_read(&ap->buf, stream, read);
}
if (read < len) {
// Insert silence
#ifndef SC_AUDIO_PLAYER_NDEBUG
LOGD("[Audio] Buffer underflow, inserting silence: %" SC_PRIsizet
" bytes", len - read);
#endif
memset(stream + read, 0, len - read);
}
}
static size_t
sc_audio_player_get_buf_size(struct sc_audio_player *ap, size_t samples) {
assert(ap->nb_channels);
assert(ap->out_bytes_per_sample);
return samples * ap->nb_channels * ap->out_bytes_per_sample;
}
static uint8_t *
sc_audio_player_get_swr_buf(struct sc_audio_player *ap, size_t min_samples) {
size_t min_buf_size = sc_audio_player_get_buf_size(ap, min_samples);
if (min_buf_size > ap->swr_buf_alloc_size) {
size_t new_size = min_buf_size + 4096;
uint8_t *buf = realloc(ap->swr_buf, new_size);
if (!buf) {
LOG_OOM();
// Could not realloc to the requested size
return NULL;
}
ap->swr_buf = buf;
ap->swr_buf_alloc_size = new_size;
}
return ap->swr_buf;
}
static bool
sc_audio_player_frame_sink_open(struct sc_frame_sink *sink,
const AVCodecContext *ctx) {
struct sc_audio_player *ap = DOWNCAST(sink);
#ifdef SCRCPY_LAVU_HAS_CHLAYOUT
assert(ctx->ch_layout.nb_channels > 0);
unsigned nb_channels = ctx->ch_layout.nb_channels;
#else
int tmp = av_get_channel_layout_nb_channels(ctx->channel_layout);
assert(tmp > 0);
unsigned nb_channels = tmp;
#endif
SDL_AudioSpec desired = {
.freq = ctx->sample_rate,
.format = SC_SDL_SAMPLE_FMT,
.channels = nb_channels,
.samples = SC_AUDIO_OUTPUT_BUFFER_SAMPLES,
.callback = sc_audio_player_sdl_callback,
.userdata = ap,
};
SDL_AudioSpec obtained;
ap->device = SDL_OpenAudioDevice(NULL, 0, &desired, &obtained, 0);
if (!ap->device) {
LOGE("Could not open audio device: %s", SDL_GetError());
return false;
}
SwrContext *swr_ctx = swr_alloc();
if (!swr_ctx) {
LOG_OOM();
goto error_close_audio_device;
}
ap->swr_ctx = swr_ctx;
assert(ctx->sample_rate > 0);
assert(!av_sample_fmt_is_planar(SC_AV_SAMPLE_FMT));
int out_bytes_per_sample = av_get_bytes_per_sample(SC_AV_SAMPLE_FMT);
assert(out_bytes_per_sample > 0);
#ifdef SCRCPY_LAVU_HAS_CHLAYOUT
av_opt_set_chlayout(swr_ctx, "in_chlayout", &ctx->ch_layout, 0);
av_opt_set_chlayout(swr_ctx, "out_chlayout", &ctx->ch_layout, 0);
#else
av_opt_set_channel_layout(swr_ctx, "in_channel_layout",
ctx->channel_layout, 0);
av_opt_set_channel_layout(swr_ctx, "out_channel_layout",
ctx->channel_layout, 0);
#endif
av_opt_set_int(swr_ctx, "in_sample_rate", ctx->sample_rate, 0);
av_opt_set_int(swr_ctx, "out_sample_rate", ctx->sample_rate, 0);
av_opt_set_sample_fmt(swr_ctx, "in_sample_fmt", ctx->sample_fmt, 0);
av_opt_set_sample_fmt(swr_ctx, "out_sample_fmt", SC_AV_SAMPLE_FMT, 0);
int ret = swr_init(swr_ctx);
if (ret) {
LOGE("Failed to initialize the resampling context");
goto error_free_swr_ctx;
}
ap->sample_rate = ctx->sample_rate;
ap->nb_channels = nb_channels;
ap->out_bytes_per_sample = out_bytes_per_sample;
size_t bytebuf_size =
sc_audio_player_get_buf_size(ap, SC_BYTEBUF_SIZE_IN_SAMPLES);
bool ok = sc_bytebuf_init(&ap->buf, bytebuf_size);
if (!ok) {
goto error_free_swr_ctx;
}
ap->safe_empty_buffer = sc_bytebuf_write_remaining(&ap->buf);
size_t initial_swr_buf_size = sc_audio_player_get_buf_size(ap, 4096);
ap->swr_buf = malloc(initial_swr_buf_size);
if (!ap->swr_buf) {
LOG_OOM();
goto error_destroy_bytebuf;
}
ap->swr_buf_alloc_size = initial_swr_buf_size;
sc_average_init(&ap->avg_buffered_samples, 32);
ap->samples_since_resync = 0;
SDL_PauseAudioDevice(ap->device, 0);
return true;
error_destroy_bytebuf:
sc_bytebuf_destroy(&ap->buf);
error_free_swr_ctx:
swr_free(&ap->swr_ctx);
error_close_audio_device:
SDL_CloseAudioDevice(ap->device);
return false;
}
static void
sc_audio_player_frame_sink_close(struct sc_frame_sink *sink) {
struct sc_audio_player *ap = DOWNCAST(sink);
assert(ap->device);
SDL_PauseAudioDevice(ap->device, 1);
SDL_CloseAudioDevice(ap->device);
free(ap->swr_buf);
sc_bytebuf_destroy(&ap->buf);
swr_free(&ap->swr_ctx);
}
static bool
sc_audio_player_frame_sink_push(struct sc_frame_sink *sink, const AVFrame *frame) {
struct sc_audio_player *ap = DOWNCAST(sink);
SwrContext *swr_ctx = ap->swr_ctx;
int64_t delay = swr_get_delay(swr_ctx, ap->sample_rate);
// No need to av_rescale_rnd(), input and output sample rates are the same
int dst_nb_samples = delay + frame->nb_samples;
uint8_t *swr_buf = sc_audio_player_get_swr_buf(ap, frame->nb_samples);
if (!swr_buf) {
return false;
}
int ret = swr_convert(swr_ctx, &swr_buf, dst_nb_samples,
(const uint8_t **) frame->data, frame->nb_samples);
if (ret < 0) {
LOGE("Resampling failed: %d", ret);
return false;
}
size_t samples_written = ret;
size_t swr_buf_size = sc_audio_player_get_buf_size(ap, samples_written);
#ifndef SC_AUDIO_PLAYER_NDEBUG
LOGI("[Audio] %" SC_PRIsizet " samples written to buffer", samples_written);
#endif
// It should almost always be possible to write without lock
bool can_write_without_lock = swr_buf_size <= ap->safe_empty_buffer;
if (can_write_without_lock) {
sc_bytebuf_prepare_write(&ap->buf, swr_buf, swr_buf_size);
}
SDL_LockAudioDevice(ap->device);
if (can_write_without_lock) {
sc_bytebuf_commit_write(&ap->buf, swr_buf_size);
} else {
sc_bytebuf_write(&ap->buf, swr_buf, swr_buf_size);
}
// The next time, it will remain at least the current empty space
ap->safe_empty_buffer = sc_bytebuf_write_remaining(&ap->buf);
// Read the value written by the SDL thread under lock
float avg;
bool has_avg = sc_average_get(&ap->avg_buffered_samples, &avg);
SDL_UnlockAudioDevice(ap->device);
if (has_avg) {
ap->samples_since_resync += samples_written;
if (ap->samples_since_resync >= ap->sample_rate) {
// Resync every second
ap->samples_since_resync = 0;
int diff = SC_TARGET_BUFFERED_SAMPLES - avg;
#ifndef SC_AUDIO_PLAYER_NDEBUG
LOGI("[Audio] Average buffered samples = %f, compensation %d",
avg, diff);
#endif
// Compensate the diff over 3 seconds (but will be recomputed after
// 1 second)
int ret = swr_set_compensation(swr_ctx, diff, 3 * ap->sample_rate);
if (ret < 0) {
LOGW("Resampling compensation failed: %d", ret);
// not fatal
}
}
}
return true;
}
void
sc_audio_player_init(struct sc_audio_player *ap) {
static const struct sc_frame_sink_ops ops = {
.open = sc_audio_player_frame_sink_open,
.close = sc_audio_player_frame_sink_close,
.push = sc_audio_player_frame_sink_push,
};
ap->frame_sink.ops = &ops;
}
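For reference, the buffering constants defined at the top of this file correspond to the following durations at the 48 kHz sample rate used here (a worked restatement of the comments above, nothing more):

```c
#include <stdio.h>

int main(void) {
    const int sample_rate = 48000;
    const int output_buffer = 480;        // SDL callback size
    const int target = 3 * output_buffer; // SC_TARGET_BUFFERED_SAMPLES
    const int threshold = 2400;           // SC_BUFFERED_SAMPLES_THRESHOLD

    printf("output buffer:  %d ms\n", 1000 * output_buffer / sample_rate); // 10 ms
    printf("target buffer:  %d ms\n", 1000 * target / sample_rate);        // 30 ms
    printf("late threshold: %d ms\n", 1000 * threshold / sample_rate);     // 50 ms
    return 0;
}
```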

View File: app/src/audio_player.h

@@ -1,54 +0,0 @@
#ifndef SC_AUDIO_PLAYER_H
#define SC_AUDIO_PLAYER_H
#include "common.h"
#include <stdbool.h>
#include "trait/frame_sink.h"
#include <util/average.h>
#include <util/bytebuf.h>
#include <util/thread.h>
#include <libavformat/avformat.h>
#include <libswresample/swresample.h>
#include <SDL2/SDL.h>
struct sc_audio_player {
struct sc_frame_sink frame_sink;
SDL_AudioDeviceID device;
// protected by SDL_AudioDeviceLock()
struct sc_bytebuf buf;
// Number of bytes which could be written without locking
size_t safe_empty_buffer;
struct SwrContext *swr_ctx;
// The sample rate is the same for input and output
unsigned sample_rate;
// The number of channels is the same for input and output
unsigned nb_channels;
unsigned out_bytes_per_sample;
// Target buffer for resampling
uint8_t *swr_buf;
size_t swr_buf_alloc_size;
// Number of buffered samples (may be negative on underflow)
struct sc_average avg_buffered_samples;
unsigned samples_since_resync;
const struct sc_audio_player_callbacks *cbs;
void *cbs_userdata;
};
struct sc_audio_player_callbacks {
void (*on_ended)(struct sc_audio_player *ap, bool success, void *userdata);
};
void
sc_audio_player_init(struct sc_audio_player *ap);
#endif

View File: app/src/cli.c

@@ -17,59 +17,50 @@
#define STR_IMPL_(x) #x
#define STR(x) STR_IMPL_(x)
enum {
OPT_RENDER_EXPIRED_FRAMES = 1000,
OPT_WINDOW_TITLE,
OPT_PUSH_TARGET,
OPT_ALWAYS_ON_TOP,
OPT_CROP,
OPT_RECORD_FORMAT,
OPT_PREFER_TEXT,
OPT_WINDOW_X,
OPT_WINDOW_Y,
OPT_WINDOW_WIDTH,
OPT_WINDOW_HEIGHT,
OPT_WINDOW_BORDERLESS,
OPT_MAX_FPS,
OPT_LOCK_VIDEO_ORIENTATION,
OPT_DISPLAY_ID,
OPT_ROTATION,
OPT_RENDER_DRIVER,
OPT_NO_MIPMAPS,
OPT_CODEC_OPTIONS,
OPT_VIDEO_CODEC_OPTIONS,
OPT_FORCE_ADB_FORWARD,
OPT_DISABLE_SCREENSAVER,
OPT_SHORTCUT_MOD,
OPT_NO_KEY_REPEAT,
OPT_FORWARD_ALL_CLICKS,
OPT_LEGACY_PASTE,
OPT_ENCODER,
OPT_VIDEO_ENCODER,
OPT_POWER_OFF_ON_CLOSE,
OPT_V4L2_SINK,
OPT_DISPLAY_BUFFER,
OPT_V4L2_BUFFER,
OPT_TUNNEL_HOST,
OPT_TUNNEL_PORT,
OPT_NO_CLIPBOARD_AUTOSYNC,
OPT_TCPIP,
OPT_RAW_KEY_EVENTS,
OPT_NO_DOWNSIZE_ON_ERROR,
OPT_OTG,
OPT_NO_CLEANUP,
OPT_PRINT_FPS,
OPT_NO_POWER_ON,
OPT_CODEC,
OPT_VIDEO_CODEC,
OPT_NO_AUDIO,
OPT_AUDIO_BIT_RATE,
OPT_AUDIO_CODEC,
OPT_AUDIO_CODEC_OPTIONS,
OPT_AUDIO_ENCODER,
OPT_LIST_ENCODERS,
OPT_LIST_DISPLAYS,
};
#define OPT_RENDER_EXPIRED_FRAMES 1000
#define OPT_WINDOW_TITLE 1001
#define OPT_PUSH_TARGET 1002
#define OPT_ALWAYS_ON_TOP 1003
#define OPT_CROP 1004
#define OPT_RECORD_FORMAT 1005
#define OPT_PREFER_TEXT 1006
#define OPT_WINDOW_X 1007
#define OPT_WINDOW_Y 1008
#define OPT_WINDOW_WIDTH 1009
#define OPT_WINDOW_HEIGHT 1010
#define OPT_WINDOW_BORDERLESS 1011
#define OPT_MAX_FPS 1012
#define OPT_LOCK_VIDEO_ORIENTATION 1013
#define OPT_DISPLAY_ID 1014
#define OPT_ROTATION 1015
#define OPT_RENDER_DRIVER 1016
#define OPT_NO_MIPMAPS 1017
#define OPT_CODEC_OPTIONS 1018
#define OPT_FORCE_ADB_FORWARD 1019
#define OPT_DISABLE_SCREENSAVER 1020
#define OPT_SHORTCUT_MOD 1021
#define OPT_NO_KEY_REPEAT 1022
#define OPT_FORWARD_ALL_CLICKS 1023
#define OPT_LEGACY_PASTE 1024
#define OPT_ENCODER_NAME 1025
#define OPT_POWER_OFF_ON_CLOSE 1026
#define OPT_V4L2_SINK 1027
#define OPT_DISPLAY_BUFFER 1028
#define OPT_V4L2_BUFFER 1029
#define OPT_TUNNEL_HOST 1030
#define OPT_TUNNEL_PORT 1031
#define OPT_NO_CLIPBOARD_AUTOSYNC 1032
#define OPT_TCPIP 1033
#define OPT_RAW_KEY_EVENTS 1034
#define OPT_NO_DOWNSIZE_ON_ERROR 1035
#define OPT_OTG 1036
#define OPT_NO_CLEANUP 1037
#define OPT_PRINT_FPS 1038
#define OPT_NO_POWER_ON 1039
#define OPT_CODEC 1040
#define OPT_NO_AUDIO 1041
#define OPT_AUDIO_BIT_RATE 1042
#define OPT_AUDIO_CODEC 1043
struct sc_option {
char shortopt;
@@ -125,48 +116,32 @@ static const struct sc_option options[] = {
.text = "Select an audio codec (opus or aac).\n"
"Default is opus.",
},
{
.longopt_id = OPT_AUDIO_CODEC_OPTIONS,
.longopt = "audio-codec-options",
.argdesc = "key[:type]=value[,...]",
.text = "Set a list of comma-separated key:type=value options for the "
"device audio encoder.\n"
"The possible values for 'type' are 'int' (default), 'long', "
"'float' and 'string'.\n"
"The list of possible codec options is available in the "
"Android documentation: "
"<https://d.android.com/reference/android/media/MediaFormat>",
},
{
.longopt_id = OPT_AUDIO_ENCODER,
.longopt = "audio-encoder",
.argdesc = "name",
.text = "Use a specific MediaCodec audio encoder (depending on the "
"codec provided by --audio-codec).\n"
"The available encoders can be listed by --list-encoders.",
},
{
.shortopt = 'b',
.longopt = "video-bit-rate",
.longopt = "bit-rate",
.argdesc = "value",
.text = "Encode the video at the given bit-rate, expressed in bits/s. "
"Unit suffixes are supported: 'K' (x1000) and 'M' (x1000000).\n"
"Default is 8M (8000000).",
},
{
// Not really deprecated (--codec has never been released), but without
// declaring an explicit --codec option, getopt_long() partial matching
// behavior would consider --codec to be equivalent to --codec-options,
// which would be confusing.
.longopt_id = OPT_CODEC,
.longopt = "codec",
.argdesc = "value",
.argdesc = "name",
.text = "Select a video codec (h264, h265 or av1).\n"
"Default is h264.",
},
{
// deprecated
.longopt_id = OPT_CODEC_OPTIONS,
.longopt = "codec-options",
.argdesc = "key[:type]=value[,...]",
.text = "Set a list of comma-separated key:type=value options for the "
"device encoder.\n"
"The possible values for 'type' are 'int' (default), 'long', "
"'float' and 'string'.\n"
"The list of possible codec options is available in the "
"Android documentation: "
"<https://d.android.com/reference/android/media/MediaFormat>",
},
{
.longopt_id = OPT_CROP,
@@ -192,9 +167,10 @@ static const struct sc_option options[] = {
.longopt_id = OPT_DISPLAY_ID,
.longopt = "display",
.argdesc = "id",
.text = "Specify the device display id to mirror.\n"
"The available display ids can be listed by:\n"
" scrcpy --list-displays\n"
.text = "Specify the display id to mirror.\n"
"The list of possible display ids can be listed by:\n"
" adb shell dumpsys display\n"
"(search \"mDisplayId=\" in the output)\n"
"Default is 0.",
},
{
@@ -212,10 +188,10 @@ static const struct sc_option options[] = {
"Also see -d (--select-usb).",
},
{
// deprecated
.longopt_id = OPT_ENCODER,
.longopt_id = OPT_ENCODER_NAME,
.longopt = "encoder",
.argdesc = "name",
.text = "Use a specific MediaCodec encoder (must be a H.264 encoder).",
},
{
.longopt_id = OPT_FORCE_ADB_FORWARD,
@@ -265,16 +241,6 @@ static const struct sc_option options[] = {
"This is a workaround for some devices not behaving as "
"expected when setting the device clipboard programmatically.",
},
{
.longopt_id = OPT_LIST_DISPLAYS,
.longopt = "list-displays",
.text = "List device displays.",
},
{
.longopt_id = OPT_LIST_ENCODERS,
.longopt = "list-encoders",
.text = "List video and audio encoders available on the device.",
},
{
.longopt_id = OPT_LOCK_VIDEO_ORIENTATION,
.longopt = "lock-video-orientation",
@@ -316,11 +282,6 @@ static const struct sc_option options[] = {
"is preserved.\n"
"Default is 0 (unlimited).",
},
{
.longopt_id = OPT_NO_AUDIO,
.longopt = "no-audio",
.text = "Disable audio forwarding.",
},
{
.longopt_id = OPT_NO_CLEANUP,
.longopt = "no-cleanup",
@@ -356,6 +317,11 @@ static const struct sc_option options[] = {
.text = "Do not display device (only when screen recording or V4L2 "
"sink is enabled).",
},
{
.longopt_id = OPT_NO_AUDIO,
.longopt = "no-audio",
.text = "Disable audio forwarding.",
},
{
.longopt_id = OPT_NO_KEY_REPEAT,
.longopt = "no-key-repeat",
@@ -567,33 +533,6 @@ static const struct sc_option options[] = {
.longopt = "version",
.text = "Print the version of scrcpy.",
},
{
.longopt_id = OPT_VIDEO_CODEC,
.longopt = "video-codec",
.argdesc = "name",
.text = "Select a video codec (h264, h265 or av1).\n"
"Default is h264.",
},
{
.longopt_id = OPT_VIDEO_CODEC_OPTIONS,
.longopt = "video-codec-options",
.argdesc = "key[:type]=value[,...]",
.text = "Set a list of comma-separated key:type=value options for the "
"device video encoder.\n"
"The possible values for 'type' are 'int' (default), 'long', "
"'float' and 'string'.\n"
"The list of possible codec options is available in the "
"Android documentation: "
"<https://d.android.com/reference/android/media/MediaFormat>",
},
{
.longopt_id = OPT_VIDEO_ENCODER,
.longopt = "video-encoder",
.argdesc = "name",
.text = "Use a specific MediaCodec video encoder (depending on the "
"codec provided by --video-codec).\n"
"The available encoders can be listed by --list-encoders.",
},
{
.shortopt = 'w',
.longopt = "stay-awake",
@@ -1470,7 +1409,7 @@ guess_record_format(const char *filename) {
}
static bool
parse_video_codec(const char *optarg, enum sc_codec *codec) {
parse_codec(const char *optarg, enum sc_codec *codec) {
if (!strcmp(optarg, "h264")) {
*codec = SC_CODEC_H264;
return true;
@@ -1483,7 +1422,7 @@ parse_video_codec(const char *optarg, enum sc_codec *codec) {
*codec = SC_CODEC_AV1;
return true;
}
LOGE("Unsupported video codec: %s (expected h264, h265 or av1)", optarg);
LOGE("Unsupported codec: %s (expected h264, h265 or av1)", optarg);
return false;
}
@@ -1512,7 +1451,7 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
while ((c = getopt_long(argc, argv, optstring, longopts, NULL)) != -1) {
switch (c) {
case 'b':
if (!parse_bit_rate(optarg, &opts->video_bit_rate)) {
if (!parse_bit_rate(optarg, &opts->bit_rate)) {
return false;
}
break;
@@ -1690,23 +1629,10 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->forward_key_repeat = false;
break;
case OPT_CODEC_OPTIONS:
LOGW("--codec-options is deprecated, use --video-codec-options "
"instead.");
// fall through
case OPT_VIDEO_CODEC_OPTIONS:
opts->video_codec_options = optarg;
opts->codec_options = optarg;
break;
case OPT_AUDIO_CODEC_OPTIONS:
opts->audio_codec_options = optarg;
break;
case OPT_ENCODER:
LOGW("--encoder is deprecated, use --video-encoder instead.");
// fall through
case OPT_VIDEO_ENCODER:
opts->video_encoder = optarg;
break;
case OPT_AUDIO_ENCODER:
opts->audio_encoder = optarg;
case OPT_ENCODER_NAME:
opts->encoder_name = optarg;
break;
case OPT_FORCE_ADB_FORWARD:
opts->force_adb_forward = true;
@@ -1756,10 +1682,7 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
opts->start_fps_counter = true;
break;
case OPT_CODEC:
LOGW("--codec is deprecated, use --video-codec instead.");
// fall through
case OPT_VIDEO_CODEC:
if (!parse_video_codec(optarg, &opts->video_codec)) {
if (!parse_codec(optarg, &opts->codec)) {
return false;
}
break;
@@ -1795,12 +1718,6 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
LOGE("V4L2 (--v4l2-buffer) is only available on Linux.");
return false;
#endif
case OPT_LIST_ENCODERS:
opts->list_encoders = true;
break;
case OPT_LIST_DISPLAYS:
opts->list_displays = true;
break;
default:
// getopt prints the error message on stderr
return false;
@@ -1882,6 +1799,13 @@ parse_args_with_getopt(struct scrcpy_cli_args *args, int argc, char *argv[],
}
}
if (opts->record_format == SC_RECORD_FORMAT_MP4
&& opts->codec == SC_CODEC_AV1) {
LOGE("Could not mux AV1 stream into MP4 container "
"(record to mkv or select another video codec)");
return false;
}
if (!opts->control) {
if (opts->turn_screen_off) {
LOGE("Could not request to turn screen off if control is disabled");

View File: app/src/compat.h

@@ -37,13 +37,6 @@
# define SCRCPY_LAVF_HAS_AVFORMATCONTEXT_URL
#endif
// Not documented in ffmpeg/doc/APIchanges, but the channel_layout API
// has been replaced by chlayout in FFmpeg commit
// f423497b455da06c1337846902c770028760e094.
#if LIBAVUTIL_VERSION_INT >= AV_VERSION_INT(57, 23, 100)
# define SCRCPY_LAVU_HAS_CHLAYOUT
#endif
#if SDL_VERSION_ATLEAST(2, 0, 6)
// <https://github.com/libsdl-org/SDL/commit/d7a318de563125e5bb465b1000d6bc9576fbc6fc>
# define SCRCPY_SDL_HAS_HINT_TOUCH_MOUSE_EVENTS

View File: app/src/decoder.c

@@ -2,7 +2,6 @@
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/channel_layout.h>
#include "events.h"
#include "video_buffer.h"
@@ -26,10 +25,11 @@ sc_decoder_close_sinks(struct sc_decoder *decoder) {
}
static bool
sc_decoder_open_sinks(struct sc_decoder *decoder, const AVCodecContext *ctx) {
sc_decoder_open_sinks(struct sc_decoder *decoder) {
for (unsigned i = 0; i < decoder->sink_count; ++i) {
struct sc_frame_sink *sink = decoder->sinks[i];
if (!sink->ops->open(sink, ctx)) {
if (!sink->ops->open(sink)) {
LOGE("Could not open frame sink %d", i);
sc_decoder_close_first_sinks(decoder, i);
return false;
}
@@ -48,23 +48,8 @@ sc_decoder_open(struct sc_decoder *decoder, const AVCodec *codec) {
decoder->codec_ctx->flags |= AV_CODEC_FLAG_LOW_DELAY;
if (codec->type == AVMEDIA_TYPE_VIDEO) {
// Hardcoded video properties
decoder->codec_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
} else {
// Hardcoded audio properties
#ifdef SCRCPY_LAVU_HAS_CHLAYOUT
decoder->codec_ctx->ch_layout =
(AVChannelLayout) AV_CHANNEL_LAYOUT_STEREO;
#else
decoder->codec_ctx->channel_layout = AV_CH_LAYOUT_STEREO;
decoder->codec_ctx->channels = 2;
#endif
decoder->codec_ctx->sample_rate = 48000;
}
if (avcodec_open2(decoder->codec_ctx, codec, NULL) < 0) {
LOGE("Decoder '%s': could not open codec", decoder->name);
LOGE("Could not open codec");
avcodec_free_context(&decoder->codec_ctx);
return false;
}
@@ -77,7 +62,8 @@ sc_decoder_open(struct sc_decoder *decoder, const AVCodec *codec) {
return false;
}
if (!sc_decoder_open_sinks(decoder, decoder->codec_ctx)) {
if (!sc_decoder_open_sinks(decoder)) {
LOGE("Could not open decoder sinks");
av_frame_free(&decoder->frame);
avcodec_close(decoder->codec_ctx);
avcodec_free_context(&decoder->codec_ctx);
@@ -100,6 +86,7 @@ push_frame_to_sinks(struct sc_decoder *decoder, const AVFrame *frame) {
for (unsigned i = 0; i < decoder->sink_count; ++i) {
struct sc_frame_sink *sink = decoder->sinks[i];
if (!sink->ops->push(sink, frame)) {
LOGE("Could not send frame to sink %d", i);
return false;
}
}
@@ -117,8 +104,7 @@ sc_decoder_push(struct sc_decoder *decoder, const AVPacket *packet) {
int ret = avcodec_send_packet(decoder->codec_ctx, packet);
if (ret < 0 && ret != AVERROR(EAGAIN)) {
LOGE("Decoder '%s': could not send video packet: %d",
decoder->name, ret);
LOGE("Could not send video packet: %d", ret);
return false;
}
ret = avcodec_receive_frame(decoder->codec_ctx, decoder->frame);
@@ -131,8 +117,7 @@ sc_decoder_push(struct sc_decoder *decoder, const AVPacket *packet) {
av_frame_unref(decoder->frame);
} else if (ret != AVERROR(EAGAIN)) {
LOGE("Decoder '%s', could not receive video frame: %d",
decoder->name, ret);
LOGE("Could not receive video frame: %d", ret);
return false;
}
return true;
@@ -158,8 +143,7 @@ sc_decoder_packet_sink_push(struct sc_packet_sink *sink,
}
void
sc_decoder_init(struct sc_decoder *decoder, const char *name) {
decoder->name = name; // statically allocated
sc_decoder_init(struct sc_decoder *decoder) {
decoder->sink_count = 0;
static const struct sc_packet_sink_ops ops = {

View File: app/src/decoder.h

@@ -14,8 +14,6 @@
struct sc_decoder {
struct sc_packet_sink packet_sink; // packet sink trait
const char *name; // must be statically allocated (e.g. a string literal)
struct sc_frame_sink *sinks[SC_DECODER_MAX_SINKS];
unsigned sink_count;
@@ -23,9 +21,8 @@ struct sc_decoder {
AVFrame *frame;
};
// The name must be statically allocated (e.g. a string literal)
void
sc_decoder_init(struct sc_decoder *decoder, const char *name);
sc_decoder_init(struct sc_decoder *decoder);
void
sc_decoder_add_sink(struct sc_decoder *decoder, struct sc_frame_sink *sink);

View File: app/src/demuxer.c

@@ -117,6 +117,7 @@ push_packet_to_sinks(struct sc_demuxer *demuxer, const AVPacket *packet) {
for (unsigned i = 0; i < demuxer->sink_count; ++i) {
struct sc_packet_sink *sink = demuxer->sinks[i];
if (!sink->ops->push(sink, packet)) {
LOGE("Could not send packet to sink %d", i);
return false;
}
}
@@ -128,7 +129,7 @@ static bool
sc_demuxer_push_packet(struct sc_demuxer *demuxer, AVPacket *packet) {
bool ok = push_packet_to_sinks(demuxer, packet);
if (!ok) {
LOGE("Demuxer '%s': could not process packet", demuxer->name);
LOGE("Could not process packet");
return false;
}
@@ -153,6 +154,7 @@ sc_demuxer_open_sinks(struct sc_demuxer *demuxer, const AVCodec *codec) {
for (unsigned i = 0; i < demuxer->sink_count; ++i) {
struct sc_packet_sink *sink = demuxer->sinks[i];
if (!sink->ops->open(sink, codec)) {
LOGE("Could not open packet sink %d", i);
sc_demuxer_close_first_sinks(demuxer, i);
return false;
}
@@ -181,43 +183,31 @@ run_demuxer(void *data) {
uint32_t raw_codec_id;
bool ok = sc_demuxer_recv_codec_id(demuxer, &raw_codec_id);
if (!ok) {
LOGE("Demuxer '%s': stream disabled due to connection error",
demuxer->name);
eos = true;
goto end;
}
if (raw_codec_id == 0) {
LOGW("Demuxer '%s': stream explicitly disabled by the device",
demuxer->name);
// Stream explicitly disabled by the device
sc_demuxer_disable_sinks(demuxer);
eos = true;
goto end;
}
if (raw_codec_id == 1) {
LOGE("Demuxer '%s': stream configuration error on the device",
demuxer->name);
goto end;
}
enum AVCodecID codec_id = sc_demuxer_to_avcodec_id(raw_codec_id);
if (codec_id == AV_CODEC_ID_NONE) {
LOGE("Demuxer '%s': stream disabled due to unsupported codec",
demuxer->name);
sc_demuxer_disable_sinks(demuxer);
// Error already logged
goto end;
}
const AVCodec *codec = avcodec_find_decoder(codec_id);
if (!codec) {
LOGE("Demuxer '%s': stream disabled due to missing decoder",
demuxer->name);
sc_demuxer_disable_sinks(demuxer);
LOGE("Decoder not found");
goto end;
}
if (!sc_demuxer_open_sinks(demuxer, codec)) {
LOGE("Could not open demuxer sinks");
goto end;
}
@@ -262,7 +252,7 @@ run_demuxer(void *data) {
}
}
LOGD("Demuxer '%s': end of frames", demuxer->name);
LOGD("End of frames");
if (must_merge_config_packet) {
sc_packet_merger_destroy(&merger);
@@ -278,11 +268,10 @@ end:
}
void
sc_demuxer_init(struct sc_demuxer *demuxer, const char *name, sc_socket socket,
sc_demuxer_init(struct sc_demuxer *demuxer, sc_socket socket,
const struct sc_demuxer_callbacks *cbs, void *cbs_userdata) {
assert(socket != SC_SOCKET_NONE);
demuxer->name = name; // statically allocated
demuxer->socket = socket;
demuxer->sink_count = 0;
@@ -302,12 +291,12 @@ sc_demuxer_add_sink(struct sc_demuxer *demuxer, struct sc_packet_sink *sink) {
bool
sc_demuxer_start(struct sc_demuxer *demuxer) {
LOGD("Demuxer '%s': starting thread", demuxer->name);
LOGD("Starting demuxer thread");
bool ok = sc_thread_create(&demuxer->thread, run_demuxer, "scrcpy-demuxer",
demuxer);
if (!ok) {
LOGE("Demuxer '%s': could not start thread", demuxer->name);
LOGE("Could not start demuxer thread");
return false;
}
return true;

View File: app/src/demuxer.h

@@ -15,8 +15,6 @@
#define SC_DEMUXER_MAX_SINKS 2
struct sc_demuxer {
const char *name; // must be statically allocated (e.g. a string literal)
sc_socket socket;
sc_thread thread;
@@ -31,9 +29,8 @@ struct sc_demuxer_callbacks {
void (*on_ended)(struct sc_demuxer *demuxer, bool eos, void *userdata);
};
// The name must be statically allocated (e.g. a string literal)
void
sc_demuxer_init(struct sc_demuxer *demuxer, const char *name, sc_socket socket,
sc_demuxer_init(struct sc_demuxer *demuxer, sc_socket socket,
const struct sc_demuxer_callbacks *cbs, void *cbs_userdata);
void

View File: app/src/icon.c

@@ -69,7 +69,7 @@ decode_image(const char *path) {
}
if (avformat_open_input(&ctx, path, NULL, NULL) < 0) {
LOGE("Could not open icon image: %s", path);
LOGE("Could not open image codec: %s", path);
goto free_ctx;
}

View File: app/src/main.c

@@ -4,6 +4,10 @@
#include <stdbool.h>
#include <unistd.h>
#include <libavformat/avformat.h>
#ifdef _WIN32
#include <windows.h>
#include "util/str.h"
#endif
#ifdef HAVE_V4L2
# include <libavdevice/avdevice.h>
#endif
@@ -15,14 +19,8 @@
#include "scrcpy.h"
#include "usb/scrcpy_otg.h"
#include "util/log.h"
#include "util/net.h"
#include "version.h"
#ifdef _WIN32
#include <windows.h>
#include "util/str.h"
#endif
int
main_scrcpy(int argc, char *argv[]) {
#ifdef _WIN32
@@ -71,12 +69,10 @@ main_scrcpy(int argc, char *argv[]) {
}
#endif
if (!net_init()) {
if (avformat_network_init()) {
return SCRCPY_EXIT_FAILURE;
}
sc_log_configure();
#ifdef HAVE_USB
enum scrcpy_exit_code ret = args.opts.otg ? scrcpy_otg(&args.opts)
: scrcpy(&args.opts);
@@ -84,6 +80,8 @@ main_scrcpy(int argc, char *argv[]) {
enum scrcpy_exit_code ret = scrcpy(&args.opts);
#endif
avformat_network_deinit(); // ignore failure
return ret;
}

View File: app/src/options.c

@@ -7,19 +7,16 @@ const struct scrcpy_options scrcpy_options_default = {
.window_title = NULL,
.push_target = NULL,
.render_driver = NULL,
.video_codec_options = NULL,
.audio_codec_options = NULL,
.video_encoder = NULL,
.audio_encoder = NULL,
.codec_options = NULL,
.encoder_name = NULL,
#ifdef HAVE_V4L2
.v4l2_device = NULL,
#endif
.log_level = SC_LOG_LEVEL_INFO,
.video_codec = SC_CODEC_H264,
.codec = SC_CODEC_H264,
.audio_codec = SC_CODEC_OPUS,
.record_format = SC_RECORD_FORMAT_AUTO,
.keyboard_input_mode = SC_KEYBOARD_INPUT_MODE_INJECT,
.mouse_input_mode = SC_MOUSE_INPUT_MODE_INJECT,
.port_range = {
.first = DEFAULT_LOCAL_PORT_RANGE_FIRST,
.last = DEFAULT_LOCAL_PORT_RANGE_LAST,
@@ -31,7 +28,7 @@ const struct scrcpy_options scrcpy_options_default = {
.count = 2,
},
.max_size = 0,
.video_bit_rate = 0,
.bit_rate = 0,
.audio_bit_rate = 0,
.max_fps = 0,
.lock_video_orientation = SC_LOCK_VIDEO_ORIENTATION_UNLOCKED,
@@ -72,6 +69,4 @@ const struct scrcpy_options scrcpy_options_default = {
.start_fps_counter = false,
.power_on = true,
.audio = true,
.list_encoders = false,
.list_displays = false,
};

View File

@@ -95,15 +95,13 @@ struct scrcpy_options {
const char *window_title;
const char *push_target;
const char *render_driver;
const char *video_codec_options;
const char *audio_codec_options;
const char *video_encoder;
const char *audio_encoder;
const char *codec_options;
const char *encoder_name;
#ifdef HAVE_V4L2
const char *v4l2_device;
#endif
enum sc_log_level log_level;
enum sc_codec video_codec;
enum sc_codec codec;
enum sc_codec audio_codec;
enum sc_record_format record_format;
enum sc_keyboard_input_mode keyboard_input_mode;
@@ -113,7 +111,7 @@ struct scrcpy_options {
uint16_t tunnel_port;
struct sc_shortcut_mods shortcut_mods;
uint16_t max_size;
uint32_t video_bit_rate;
uint32_t bit_rate;
uint32_t audio_bit_rate;
uint16_t max_fps;
enum sc_lock_video_orientation lock_video_orientation;
@@ -154,8 +152,6 @@ struct scrcpy_options {
bool start_fps_counter;
bool power_on;
bool audio;
bool list_encoders;
bool list_displays;
};
extern const struct scrcpy_options scrcpy_options_default;

View File: app/src/recorder.c

@@ -216,12 +216,7 @@ sc_recorder_wait_audio_stream(struct sc_recorder *recorder) {
stream->codecpar->codec_type = AVMEDIA_TYPE_AUDIO;
stream->codecpar->codec_id = codec->id;
#ifdef SCRCPY_LAVU_HAS_CHLAYOUT
stream->codecpar->ch_layout.nb_channels = 2;
#else
stream->codecpar->channel_layout = AV_CH_LAYOUT_STEREO;
stream->codecpar->channels = 2;
#endif
stream->codecpar->sample_rate = 48000;
recorder->audio_stream_index = stream->index;
@@ -254,9 +249,7 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
sc_cond_wait(&recorder->queue_cond, &recorder->mutex);
}
if (recorder->stopped && sc_queue_is_empty(&recorder->video_queue)) {
// If the recorder is stopped, don't process anything if there are not
// at least video packets
if (recorder->stopped && sc_recorder_has_empty_queues(recorder)) {
sc_mutex_unlock(&recorder->mutex);
return false;
}
@@ -264,9 +257,8 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
struct sc_record_packet *video_pkt;
sc_queue_take(&recorder->video_queue, next, &video_pkt);
struct sc_record_packet *audio_pkt = NULL;
if (!sc_queue_is_empty(&recorder->audio_queue)) {
assert(recorder->audio);
struct sc_record_packet *audio_pkt;
if (recorder->audio) {
sc_queue_take(&recorder->audio_queue, next, &audio_pkt);
}
@@ -287,7 +279,7 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
goto end;
}
if (audio_pkt) {
if (recorder->audio) {
if (audio_pkt->packet->pts != AV_NOPTS_VALUE) {
LOGE("The first audio packet is not a config packet");
goto end;
@@ -312,7 +304,7 @@ sc_recorder_process_header(struct sc_recorder *recorder) {
end:
sc_record_packet_delete(video_pkt);
if (audio_pkt) {
if (recorder->audio) {
sc_record_packet_delete(audio_pkt);
}
@@ -361,6 +353,12 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
assert(recorder->audio
|| (!audio_pkt && sc_queue_is_empty(&recorder->audio_queue)));
if (recorder->stopped && sc_queue_is_empty(&recorder->video_queue)
&& sc_queue_is_empty(&recorder->audio_queue)) {
sc_mutex_unlock(&recorder->mutex);
break;
}
if (!video_pkt && !sc_queue_is_empty(&recorder->video_queue)) {
sc_queue_take(&recorder->video_queue, next, &video_pkt);
}
@@ -369,13 +367,6 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
sc_queue_take(&recorder->audio_queue, next, &audio_pkt);
}
if (recorder->stopped && !video_pkt && !audio_pkt) {
assert(sc_queue_is_empty(&recorder->video_queue));
assert(sc_queue_is_empty(&recorder->audio_queue));
sc_mutex_unlock(&recorder->mutex);
break;
}
assert(video_pkt || audio_pkt); // at least one
sc_mutex_unlock(&recorder->mutex);
@@ -400,21 +391,6 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
} else if (video_pkt && audio_pkt) {
pts_origin =
MIN(video_pkt->packet->pts, audio_pkt->packet->pts);
} else if (recorder->stopped) {
if (video_pkt) {
// The recorder is stopped without audio, record the video
// packets
pts_origin = video_pkt->packet->pts;
} else {
// Fail if there is no video
error = true;
goto end;
}
// If the recorder is stopped while one of the streams has no
// packets, then we must avoid a live-loop and correctly record
// the stream having packets.
pts_origin = video_pkt ? video_pkt->packet->pts
: audio_pkt->packet->pts;
} else {
// We need both video and audio packets to initialize pts_origin
continue;
@@ -437,8 +413,8 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
sc_record_packet_delete(video_pkt_previous);
if (!ok) {
LOGE("Could not record video packet");
error = true;
goto end;
// TODO cleanup
return false;
}
}
@@ -451,13 +427,13 @@ sc_recorder_process_packets(struct sc_recorder *recorder) {
audio_pkt->packet->dts = audio_pkt->packet->pts;
bool ok = sc_recorder_write_audio(recorder, audio_pkt->packet);
sc_record_packet_delete(audio_pkt);
if (!ok) {
LOGE("Could not record audio packet");
error = true;
goto end;
}
sc_record_packet_delete(audio_pkt);
audio_pkt = NULL;
}
}
@@ -635,6 +611,7 @@ sc_recorder_audio_packet_sink_close(struct sc_packet_sink *sink) {
sc_mutex_lock(&recorder->mutex);
// EOS also stops the recorder
// TODO must stop only once both video and audio stream are complete
recorder->stopped = true;
sc_cond_signal(&recorder->queue_cond);
sc_mutex_unlock(&recorder->mutex);
@@ -680,7 +657,7 @@ sc_recorder_audio_packet_sink_disable(struct sc_packet_sink *sink) {
assert(!recorder->audio_disabled);
assert(!recorder->audio_codec);
LOGW("Audio stream recording disabled");
LOGW("Audio stream disabled: it could not be captured by the device");
sc_mutex_lock(&recorder->mutex);
recorder->audio_disabled = true;
@@ -731,6 +708,7 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
recorder->declared_frame_size = declared_frame_size;
assert(cbs && cbs->on_ended);
recorder->cbs = cbs;
recorder->cbs_userdata = cbs_userdata;
@@ -753,8 +731,17 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
recorder->audio_packet_sink.ops = &audio_ops;
}
ok = sc_thread_create(&recorder->thread, run_recorder, "scrcpy-recorder",
recorder);
if (!ok) {
LOGE("Could not start recorder thread");
goto error_stream_cond_destroy;
}
return true;
error_stream_cond_destroy:
sc_cond_destroy(&recorder->stream_cond);
error_queue_cond_destroy:
sc_cond_destroy(&recorder->queue_cond);
error_mutex_destroy:
@@ -765,18 +752,6 @@ error_free_filename:
return false;
}
bool
sc_recorder_start(struct sc_recorder *recorder) {
bool ok = sc_thread_create(&recorder->thread, run_recorder,
"scrcpy-recorder", recorder);
if (!ok) {
LOGE("Could not start recorder thread");
return false;
}
return true;
}
void
sc_recorder_stop(struct sc_recorder *recorder) {
sc_mutex_lock(&recorder->mutex);

View File: app/src/recorder.h

@@ -72,9 +72,6 @@ sc_recorder_init(struct sc_recorder *recorder, const char *filename,
struct sc_size declared_frame_size,
const struct sc_recorder_callbacks *cbs, void *cbs_userdata);
bool
sc_recorder_start(struct sc_recorder *recorder);
void
sc_recorder_stop(struct sc_recorder *recorder);

View File: app/src/scrcpy.c

@@ -13,7 +13,6 @@
# include <windows.h>
#endif
#include "audio_player.h"
#include "controller.h"
#include "decoder.h"
#include "demuxer.h"
@@ -41,11 +40,9 @@
struct scrcpy {
struct sc_server server;
struct sc_screen screen;
struct sc_audio_player audio_player;
struct sc_demuxer video_demuxer;
struct sc_demuxer audio_demuxer;
struct sc_decoder video_decoder;
struct sc_decoder audio_decoder;
struct sc_decoder decoder;
struct sc_recorder recorder;
#ifdef HAVE_V4L2
struct sc_v4l2_sink v4l2_sink;
@@ -186,16 +183,15 @@ await_for_server(bool *connected) {
while (SDL_WaitEvent(&event)) {
switch (event.type) {
case SDL_QUIT:
if (connected) {
*connected = false;
}
LOGD("User requested to quit");
*connected = false;
return true;
case SC_EVENT_SERVER_CONNECTION_FAILED:
LOGE("Server connection failed");
return false;
case SC_EVENT_SERVER_CONNECTED:
if (connected) {
*connected = true;
}
LOGD("Server connected");
*connected = true;
return true;
default:
break;
@@ -206,6 +202,43 @@ await_for_server(bool *connected) {
return false;
}
static SDL_LogPriority
sdl_priority_from_av_level(int level) {
switch (level) {
case AV_LOG_PANIC:
case AV_LOG_FATAL:
return SDL_LOG_PRIORITY_CRITICAL;
case AV_LOG_ERROR:
return SDL_LOG_PRIORITY_ERROR;
case AV_LOG_WARNING:
return SDL_LOG_PRIORITY_WARN;
case AV_LOG_INFO:
return SDL_LOG_PRIORITY_INFO;
}
// do not forward others, which are too verbose
return 0;
}
static void
av_log_callback(void *avcl, int level, const char *fmt, va_list vl) {
(void) avcl;
SDL_LogPriority priority = sdl_priority_from_av_level(level);
if (priority == 0) {
return;
}
size_t fmt_len = strlen(fmt);
char *local_fmt = malloc(fmt_len + 10);
if (!local_fmt) {
LOG_OOM();
return;
}
memcpy(local_fmt, "[FFmpeg] ", 9); // do not write the final '\0'
memcpy(local_fmt + 9, fmt, fmt_len + 1); // include '\0'
SDL_LogMessageV(SDL_LOG_CATEGORY_VIDEO, priority, local_fmt, vl);
free(local_fmt);
}
static void
sc_recorder_on_ended(struct sc_recorder *recorder, bool success,
void *userdata) {
@@ -234,19 +267,10 @@ static void
sc_audio_demuxer_on_ended(struct sc_demuxer *demuxer, bool eos,
void *userdata) {
(void) demuxer;
(void) eos;
(void) userdata;
// Contrary to the video demuxer, keep mirroring if only the audio fails.
// 'eos' is true on end-of-stream, including when audio capture is not
// possible on the device (so that scrcpy continues to mirror video without
// failing).
// However, if an audio configuration failure occurs (for example the user
// explicitly selected an unknown audio encoder), 'eos' is false and scrcpy
// must exit.
if (!eos) {
PUSH_EVENT(SC_EVENT_DEMUXER_ERROR);
}
// Contrary to the video demuxer, keep mirroring if only the audio fails
}
static void
@@ -302,7 +326,6 @@ scrcpy(struct scrcpy_options *options) {
bool server_started = false;
bool file_pusher_initialized = false;
bool recorder_initialized = false;
bool recorder_started = false;
#ifdef HAVE_V4L2
bool v4l2_sink_initialized = false;
#endif
@@ -327,14 +350,14 @@ scrcpy(struct scrcpy_options *options) {
.select_usb = options->select_usb,
.select_tcpip = options->select_tcpip,
.log_level = options->log_level,
.video_codec = options->video_codec,
.codec = options->codec,
.audio_codec = options->audio_codec,
.crop = options->crop,
.port_range = options->port_range,
.tunnel_host = options->tunnel_host,
.tunnel_port = options->tunnel_port,
.max_size = options->max_size,
.video_bit_rate = options->video_bit_rate,
.bit_rate = options->bit_rate,
.audio_bit_rate = options->audio_bit_rate,
.max_fps = options->max_fps,
.lock_video_orientation = options->lock_video_orientation,
@@ -343,10 +366,8 @@ scrcpy(struct scrcpy_options *options) {
.audio = options->audio,
.show_touches = options->show_touches,
.stay_awake = options->stay_awake,
.video_codec_options = options->video_codec_options,
.audio_codec_options = options->audio_codec_options,
.video_encoder = options->video_encoder,
.audio_encoder = options->audio_encoder,
.codec_options = options->codec_options,
.encoder_name = options->encoder_name,
.force_adb_forward = options->force_adb_forward,
.power_off_on_close = options->power_off_on_close,
.clipboard_autosync = options->clipboard_autosync,
@@ -355,8 +376,6 @@ scrcpy(struct scrcpy_options *options) {
.tcpip_dst = options->tcpip_dst,
.cleanup = options->cleanup,
.power_on = options->power_on,
.list_encoders = options->list_encoders,
.list_displays = options->list_displays,
};
static const struct sc_server_callbacks cbs = {
@@ -374,27 +393,14 @@ scrcpy(struct scrcpy_options *options) {
server_started = true;
if (options->list_encoders || options->list_displays) {
bool ok = await_for_server(NULL);
ret = ok ? SCRCPY_EXIT_SUCCESS : SCRCPY_EXIT_FAILURE;
goto end;
}
if (options->display) {
sdl_set_hints(options->render_driver);
}
// Initialize SDL video in addition if display is enabled
if (options->display) {
if (SDL_Init(SDL_INIT_VIDEO)) {
LOGE("Could not initialize SDL video: %s", SDL_GetError());
goto end;
}
if (options->audio && SDL_Init(SDL_INIT_AUDIO)) {
LOGE("Could not initialize SDL audio: %s", SDL_GetError());
goto end;
}
if (options->display && SDL_Init(SDL_INIT_VIDEO)) {
LOGE("Could not initialize SDL: %s", SDL_GetError());
goto end;
}
sdl_configure(options->display, options->disable_screensaver);
@@ -402,19 +408,15 @@ scrcpy(struct scrcpy_options *options) {
// Await for server without blocking Ctrl+C handling
bool connected;
if (!await_for_server(&connected)) {
LOGE("Server connection failed");
goto end;
}
if (!connected) {
// This is not an error, user requested to quit
LOGD("User requested to quit");
ret = SCRCPY_EXIT_SUCCESS;
goto end;
}
LOGD("Server connected");
// It is necessarily initialized here, since the device is connected
struct sc_server_info *info = &s->server.info;
@@ -432,34 +434,17 @@ scrcpy(struct scrcpy_options *options) {
file_pusher_initialized = true;
}
static const struct sc_demuxer_callbacks video_demuxer_cbs = {
.on_ended = sc_video_demuxer_on_ended,
};
sc_demuxer_init(&s->video_demuxer, "video", s->server.video_socket,
&video_demuxer_cbs, NULL);
if (options->audio) {
static const struct sc_demuxer_callbacks audio_demuxer_cbs = {
.on_ended = sc_audio_demuxer_on_ended,
};
sc_demuxer_init(&s->audio_demuxer, "audio", s->server.audio_socket,
&audio_demuxer_cbs, NULL);
}
bool needs_video_decoder = options->display;
bool needs_audio_decoder = options->audio && options->display;
struct sc_decoder *dec = NULL;
bool needs_decoder = options->display;
#ifdef HAVE_V4L2
needs_video_decoder |= !!options->v4l2_device;
needs_decoder |= !!options->v4l2_device;
#endif
if (needs_video_decoder) {
sc_decoder_init(&s->video_decoder, "video");
sc_demuxer_add_sink(&s->video_demuxer, &s->video_decoder.packet_sink);
}
if (needs_audio_decoder) {
sc_decoder_init(&s->audio_decoder, "audio");
sc_demuxer_add_sink(&s->audio_demuxer, &s->audio_decoder.packet_sink);
if (needs_decoder) {
sc_decoder_init(&s->decoder);
dec = &s->decoder;
}
struct sc_recorder *rec = NULL;
if (options->record_filename) {
static const struct sc_recorder_callbacks recorder_cbs = {
.on_ended = sc_recorder_on_ended,
@@ -469,17 +454,34 @@ scrcpy(struct scrcpy_options *options) {
info->frame_size, &recorder_cbs, NULL)) {
goto end;
}
rec = &s->recorder;
recorder_initialized = true;
}
if (!sc_recorder_start(&s->recorder)) {
goto end;
}
recorder_started = true;
av_log_set_callback(av_log_callback);
sc_demuxer_add_sink(&s->video_demuxer, &s->recorder.video_packet_sink);
static const struct sc_demuxer_callbacks video_demuxer_cbs = {
.on_ended = sc_video_demuxer_on_ended,
};
sc_demuxer_init(&s->video_demuxer, s->server.video_socket,
&video_demuxer_cbs, NULL);
if (options->audio) {
static const struct sc_demuxer_callbacks audio_demuxer_cbs = {
.on_ended = sc_audio_demuxer_on_ended,
};
sc_demuxer_init(&s->audio_demuxer, s->server.audio_socket,
&audio_demuxer_cbs, NULL);
}
if (dec) {
sc_demuxer_add_sink(&s->video_demuxer, &dec->packet_sink);
}
if (rec) {
sc_demuxer_add_sink(&s->video_demuxer, &rec->video_packet_sink);
if (options->audio) {
sc_demuxer_add_sink(&s->audio_demuxer,
&s->recorder.audio_packet_sink);
sc_demuxer_add_sink(&s->audio_demuxer, &rec->audio_packet_sink);
}
}
@@ -671,12 +673,7 @@ aoa_hid_end:
}
screen_initialized = true;
sc_decoder_add_sink(&s->video_decoder, &s->screen.frame_sink);
if (options->audio) {
sc_audio_player_init(&s->audio_player);
sc_decoder_add_sink(&s->audio_decoder, &s->audio_player.frame_sink);
}
sc_decoder_add_sink(&s->decoder, &s->screen.frame_sink);
}
#ifdef HAVE_V4L2
@@ -686,7 +683,7 @@ aoa_hid_end:
goto end;
}
sc_decoder_add_sink(&s->video_decoder, &s->v4l2_sink.frame_sink);
sc_decoder_add_sink(&s->decoder, &s->v4l2_sink.frame_sink);
v4l2_sink_initialized = true;
}
@@ -790,10 +787,8 @@ end:
sc_controller_destroy(&s->controller);
}
if (recorder_started) {
sc_recorder_join(&s->recorder);
}
if (recorder_initialized) {
sc_recorder_join(&s->recorder);
sc_recorder_destroy(&s->recorder);
}
@@ -802,10 +797,6 @@ end:
sc_file_pusher_destroy(&s->file_pusher);
}
if (server_started) {
sc_server_join(&s->server);
}
sc_server_destroy(&s->server);
return ret;

View File

@@ -330,11 +330,7 @@ event_watcher(void *data, SDL_Event *event) {
#endif
static bool
sc_screen_frame_sink_open(struct sc_frame_sink *sink,
const AVCodecContext *ctx) {
assert(ctx->pix_fmt == AV_PIX_FMT_YUV420P);
(void) ctx;
sc_screen_frame_sink_open(struct sc_frame_sink *sink) {
struct sc_screen *screen = DOWNCAST(sink);
(void) screen;
#ifndef NDEBUG

View File

@@ -71,10 +71,8 @@ sc_server_params_destroy(struct sc_server_params *params) {
// The server stores a copy of the params provided by the user
free((char *) params->req_serial);
free((char *) params->crop);
free((char *) params->video_codec_options);
free((char *) params->audio_codec_options);
free((char *) params->video_encoder);
free((char *) params->audio_encoder);
free((char *) params->codec_options);
free((char *) params->encoder_name);
free((char *) params->tcpip_dst);
}
@@ -97,10 +95,8 @@ sc_server_params_copy(struct sc_server_params *dst,
COPY(req_serial);
COPY(crop);
COPY(video_codec_options);
COPY(audio_codec_options);
COPY(video_encoder);
COPY(audio_encoder);
COPY(codec_options);
COPY(encoder_name);
COPY(tcpip_dst);
#undef COPY
@@ -224,17 +220,16 @@ execute_server(struct sc_server *server,
ADD_PARAM("scid=%08x", params->scid);
ADD_PARAM("log_level=%s", log_level_to_server_string(params->log_level));
if (params->video_bit_rate) {
ADD_PARAM("video_bit_rate=%" PRIu32, params->video_bit_rate);
if (params->bit_rate) {
ADD_PARAM("bit_rate=%" PRIu32, params->bit_rate);
}
if (!params->audio) {
ADD_PARAM("audio=false");
} else if (params->audio_bit_rate) {
ADD_PARAM("audio_bit_rate=%" PRIu32, params->audio_bit_rate);
}
if (params->video_codec != SC_CODEC_H264) {
ADD_PARAM("video_codec=%s",
sc_server_get_codec_name(params->video_codec));
if (params->codec != SC_CODEC_H264) {
ADD_PARAM("codec=%s", sc_server_get_codec_name(params->codec));
}
if (params->audio_codec != SC_CODEC_OPUS) {
ADD_PARAM("audio_codec=%s",
@@ -269,17 +264,11 @@ execute_server(struct sc_server *server,
if (params->stay_awake) {
ADD_PARAM("stay_awake=true");
}
if (params->video_codec_options) {
ADD_PARAM("video_codec_options=%s", params->video_codec_options);
if (params->codec_options) {
ADD_PARAM("codec_options=%s", params->codec_options);
}
if (params->audio_codec_options) {
ADD_PARAM("audio_codec_options=%s", params->audio_codec_options);
}
if (params->video_encoder) {
ADD_PARAM("video_encoder=%s", params->video_encoder);
}
if (params->audio_encoder) {
ADD_PARAM("audio_encoder=%s", params->audio_encoder);
if (params->encoder_name) {
ADD_PARAM("encoder_name=%s", params->encoder_name);
}
if (params->power_off_on_close) {
ADD_PARAM("power_off_on_close=true");
@@ -300,12 +289,6 @@ execute_server(struct sc_server *server,
// By default, power_on is true
ADD_PARAM("power_on=false");
}
if (params->list_encoders) {
ADD_PARAM("list_encoders=true");
}
if (params->list_displays) {
ADD_PARAM("list_displays=true");
}
#undef ADD_PARAM
@@ -849,25 +832,6 @@ run_server(void *data) {
assert(serial);
LOGD("Device serial: %s", serial);
ok = push_server(&server->intr, serial);
if (!ok) {
goto error_connection_failed;
}
// If --list-* is passed, then the server just prints the requested data
// and then exits.
if (params->list_encoders || params->list_displays) {
sc_pid pid = execute_server(server, params);
if (pid == SC_PROCESS_NONE) {
goto error_connection_failed;
}
sc_process_wait(pid, NULL); // ignore exit code
sc_process_close(pid);
// Wake up await_for_server()
server->cbs->on_connected(server, server->cbs_userdata);
return 0;
}
int r = asprintf(&server->device_socket_name, SC_SOCKET_NAME_PREFIX "%08x",
params->scid);
if (r == -1) {
@@ -877,6 +841,11 @@ run_server(void *data) {
assert(r == sizeof(SC_SOCKET_NAME_PREFIX) - 1 + 8);
assert(server->device_socket_name);
ok = push_server(&server->intr, serial);
if (!ok) {
goto error_connection_failed;
}
ok = sc_adb_tunnel_open(&server->tunnel, &server->intr, serial,
server->device_socket_name, params->port_range,
params->force_adb_forward);
@@ -986,10 +955,7 @@ sc_server_stop(struct sc_server *server) {
sc_cond_signal(&server->cond_stopped);
sc_intr_interrupt(&server->intr);
sc_mutex_unlock(&server->mutex);
}
void
sc_server_join(struct sc_server *server) {
sc_thread_join(&server->thread, NULL);
}

View File

@@ -25,18 +25,16 @@ struct sc_server_params {
uint32_t scid;
const char *req_serial;
enum sc_log_level log_level;
enum sc_codec video_codec;
enum sc_codec codec;
enum sc_codec audio_codec;
const char *crop;
const char *video_codec_options;
const char *audio_codec_options;
const char *video_encoder;
const char *audio_encoder;
const char *codec_options;
const char *encoder_name;
struct sc_port_range port_range;
uint32_t tunnel_host;
uint16_t tunnel_port;
uint16_t max_size;
uint32_t video_bit_rate;
uint32_t bit_rate;
uint32_t audio_bit_rate;
uint16_t max_fps;
int8_t lock_video_orientation;
@@ -55,8 +53,6 @@ struct sc_server_params {
bool select_tcpip;
bool cleanup;
bool power_on;
bool list_encoders;
bool list_displays;
};
struct sc_server {
@@ -116,10 +112,6 @@ sc_server_start(struct sc_server *server);
void
sc_server_stop(struct sc_server *server);
// join the server thread
void
sc_server_join(struct sc_server *server);
// close and release sockets
void
sc_server_destroy(struct sc_server *server);

View File

@@ -5,7 +5,6 @@
#include <assert.h>
#include <stdbool.h>
#include <libavcodec/avcodec.h>
typedef struct AVFrame AVFrame;
@@ -19,7 +18,7 @@ struct sc_frame_sink {
};
struct sc_frame_sink_ops {
bool (*open)(struct sc_frame_sink *sink, const AVCodecContext *ctx);
bool (*open)(struct sc_frame_sink *sink);
void (*close)(struct sc_frame_sink *sink);
bool (*push)(struct sc_frame_sink *sink, const AVFrame *frame);
};
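The frame sinks in scrcpy are implemented as small traits: a struct embedding an ops table of function pointers. Below is a minimal sketch of a custom sink implementing the ctx-less open() variant shown above; the names are hypothetical (the real sinks are sc_screen and sc_v4l2_sink, which recover their enclosing struct with a container_of-style DOWNCAST macro).

#include "trait/frame_sink.h"

struct my_sink {
    struct sc_frame_sink frame_sink; // trait instance (first member in this sketch)
    unsigned frame_count;            // sink-specific state
};

static bool
my_sink_open(struct sc_frame_sink *sink) {
    // valid only because frame_sink is the first member in this sketch;
    // the real code uses a container_of-style DOWNCAST macro instead
    struct my_sink *s = (struct my_sink *) sink;
    s->frame_count = 0;
    return true;
}

static void
my_sink_close(struct sc_frame_sink *sink) {
    (void) sink;
}

static bool
my_sink_push(struct sc_frame_sink *sink, const AVFrame *frame) {
    struct my_sink *s = (struct my_sink *) sink;
    (void) frame; // consume the decoded frame here
    ++s->frame_count;
    return true;
}

static const struct sc_frame_sink_ops my_sink_ops = {
    .open = my_sink_open,
    .close = my_sink_close,
    .push = my_sink_push,
};

An init function would then set my_sink.frame_sink.ops = &my_sink_ops before registering the sink on a decoder, mirroring how the recorder assigns its packet sink ops earlier in this diff.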

View File

@@ -1,116 +0,0 @@
#include "bytebuf.h"
#include <assert.h>
#include <stdlib.h>
#include <string.h>
#include "util/log.h"
bool
sc_bytebuf_init(struct sc_bytebuf *buf, size_t alloc_size) {
assert(alloc_size);
// sufficient, but use more for alignment.
buf->data = malloc(alloc_size);
if (!buf->data) {
LOG_OOM();
return false;
}
buf->alloc_size = alloc_size;
buf->head = 0;
buf->tail = 0;
return true;
}
void
sc_bytebuf_destroy(struct sc_bytebuf *buf) {
free(buf->data);
}
void
sc_bytebuf_read(struct sc_bytebuf *buf, uint8_t *to, size_t len) {
assert(len);
assert(sc_bytebuf_read_remaining(buf) >= len);
assert(buf->tail != buf->head); // the buffer cannot be empty here
size_t right_limit = buf->tail < buf->head ? buf->head : buf->alloc_size;
size_t right_len = right_limit - buf->tail;
if (len < right_len) {
right_len = len;
}
memcpy(to, buf->data + buf->tail, right_len);
if (len > right_len) {
memcpy(to + right_len, buf->data, len - right_len);
}
buf->tail = (buf->tail + len) % buf->alloc_size;
}
void
sc_bytebuf_skip(struct sc_bytebuf *buf, size_t len) {
assert(len);
assert(sc_bytebuf_read_remaining(buf) >= len);
assert(buf->tail != buf->head); // the buffer cannot be empty here
buf->tail = (buf->tail + len) % buf->alloc_size;
}
void
sc_bytebuf_write(struct sc_bytebuf *buf, const uint8_t *from, size_t len) {
assert(len);
size_t max_len = buf->alloc_size - 1;
if (len >= max_len) {
// Copy only the right-most bytes
memcpy(buf->data, from + len - max_len, max_len);
buf->tail = 0;
buf->head = max_len;
return;
}
size_t right_limit = buf->head < buf->tail ? buf->tail : buf->alloc_size;
size_t right_len = right_limit - buf->head;
if (len < right_len) {
right_len = len;
}
memcpy(buf->data + buf->head, from, right_len);
if (len > right_len) {
memcpy(buf->data, from + right_len, len - right_len);
}
size_t empty_space = sc_bytebuf_write_remaining(buf);
if (len > empty_space) {
buf->tail = (buf->tail + len - empty_space) % buf->alloc_size;
}
buf->head = (buf->head + len) % buf->alloc_size;
}
void
sc_bytebuf_prepare_write(struct sc_bytebuf *buf, const uint8_t *from,
size_t len) {
// *This function MUST NOT access buf->tail (even in assert()).*
// The purpose of this function is to allow a reader and a writer to access
// different parts of the buffer in parallel. It is intended
// to be called without lock (only sc_bytebuf_commit_write() is intended to
// be called with lock held).
assert(len < buf->alloc_size - 1);
size_t right_len = buf->alloc_size - buf->head;
if (len < right_len) {
right_len = len;
}
memcpy(buf->data + buf->head, from, right_len);
if (len > right_len) {
memcpy(buf->data, from + right_len, len - right_len);
}
}
void
sc_bytebuf_commit_write(struct sc_bytebuf *buf, size_t len) {
assert(len <= sc_bytebuf_write_remaining(buf));
buf->head = (buf->head + len) % buf->alloc_size;
}
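The comment above sc_bytebuf_prepare_write() describes a two-phase write: copy the payload without the lock, then take the lock only to publish the new head. Below is a minimal sketch of a producer using that pattern; the function is hypothetical and assumes scrcpy's sc_mutex helpers from util/thread.h.

#include <assert.h>
#include "util/bytebuf.h"
#include "util/thread.h"

static bool
producer_push(struct sc_bytebuf *buf, sc_mutex *mutex,
              const uint8_t *data, size_t len) {
    assert(len < buf->alloc_size - 1); // precondition of sc_bytebuf_prepare_write()

    sc_mutex_lock(mutex);
    // The reader can only free more space afterwards, so this check remains
    // valid once the lock is released
    bool fits = sc_bytebuf_write_remaining(buf) >= len;
    sc_mutex_unlock(mutex);
    if (!fits) {
        return false;
    }

    // Copy the payload without holding the lock (does not touch the tail)
    sc_bytebuf_prepare_write(buf, data, len);

    // Take the lock only to publish the new head
    sc_mutex_lock(mutex);
    sc_bytebuf_commit_write(buf, len);
    sc_mutex_unlock(mutex);
    return true;
}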

View File

@@ -1,104 +0,0 @@
#ifndef SC_BYTEBUF_H
#define SC_BYTEBUF_H
#include "common.h"
#include <stdbool.h>
#include <stdint.h>
struct sc_bytebuf {
uint8_t *data;
// The actual capacity is (allocated - 1) so that head == tail is
// unambiguous
size_t alloc_size;
size_t head;
size_t tail;
// empty: tail == head
// full: (tail + 1) % allocated == head
};
bool
sc_bytebuf_init(struct sc_bytebuf *buf, size_t alloc_size);
/**
* Copy from the bytebuf to a user-provided array
*
The caller must check that len <= sc_bytebuf_read_remaining(buf) (it is an
error to attempt to read more bytes than available).
*
* This function is guaranteed to not change the head.
*/
void
sc_bytebuf_read(struct sc_bytebuf *buf, uint8_t *to, size_t len);
/**
* Drop len bytes from the buffer
*
The caller must check that len <= sc_bytebuf_read_remaining(buf) (it is an
error to attempt to skip more bytes than available).
*
* This function is guaranteed to not change the head.
*
It is equivalent to calling sc_bytebuf_read() into some array and discarding
the array (but more efficient since there is no copy).
*/
void
sc_bytebuf_skip(struct sc_bytebuf *buf, size_t len);
/**
* Copy the user-provided array to the bytebuf
*
* The length of the input array is not restricted:
if len >= sc_bytebuf_write_remaining(buf), then the excess input bytes
* will overwrite the oldest bytes in the buffer.
*/
void
sc_bytebuf_write(struct sc_bytebuf *buf, const uint8_t *from, size_t len);
/**
* Copy the user-provided array to the bytebuf, but do not advance the cursor
*
The caller must check that len <= sc_bytebuf_write_remaining(buf) (it is an
error to attempt to write more bytes than the available space).
*
* After this function is called, the write must be committed with
* sc_bytebuf_commit_write().
*
* The purpose of this mechanism is to acquire a lock only to commit the write,
* but not to perform the actual copy.
*/
void
sc_bytebuf_prepare_write(struct sc_bytebuf *buf, const uint8_t *from,
size_t len);
/**
* Commit a prepared write
*/
void
sc_bytebuf_commit_write(struct sc_bytebuf *buf, size_t len);
/**
* Return the number of bytes which can be read
*
* It is an error to read more bytes than available.
*/
static inline size_t
sc_bytebuf_read_remaining(struct sc_bytebuf *buf) {
return (buf->alloc_size + buf->head - buf->tail) % buf->alloc_size;
}
/**
* Return the number of bytes which can be written without overwriting
*
* It is not an error to write more bytes than the available space, but this
* would overwrite the oldest bytes in the buffer.
*/
static inline size_t
sc_bytebuf_write_remaining(struct sc_bytebuf *buf) {
return (buf->alloc_size + buf->tail - buf->head - 1) % buf->alloc_size;
}
void
sc_bytebuf_destroy(struct sc_bytebuf *buf);
#endif
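As a quick sanity check of the two remaining() formulas above (a standalone hypothetical computation, not part of the scrcpy sources): with alloc_size = 10, head = 3 and tail = 7, six bytes are readable and three are writable, and together they account for alloc_size - 1, because one slot is kept free so that head == tail always means "empty".

#include <assert.h>

int main(void) {
    unsigned alloc_size = 10, head = 3, tail = 7;
    // (10 + 3 - 7) % 10 == 6 bytes available for reading
    unsigned readable = (alloc_size + head - tail) % alloc_size;
    // (10 + 7 - 3 - 1) % 10 == 3 bytes writable without overwriting
    unsigned writable = (alloc_size + tail - head - 1) % alloc_size;
    assert(readable == 6);
    assert(writable == 3);
    assert(readable + writable == alloc_size - 1);
    return 0;
}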

View File

@@ -4,7 +4,6 @@
# include <windows.h>
#endif
#include <assert.h>
#include <libavformat/avformat.h>
static SDL_LogPriority
log_level_sc_to_sdl(enum sc_log_level level) {
@@ -48,7 +47,6 @@ void
sc_set_log_level(enum sc_log_level level) {
SDL_LogPriority sdl_log = log_level_sc_to_sdl(level);
SDL_LogSetPriority(SDL_LOG_CATEGORY_APPLICATION, sdl_log);
SDL_LogSetPriority(SDL_LOG_CATEGORY_CUSTOM, sdl_log);
}
enum sc_log_level
@@ -87,46 +85,3 @@ sc_log_windows_error(const char *prefix, int error) {
return true;
}
#endif
static SDL_LogPriority
sdl_priority_from_av_level(int level) {
switch (level) {
case AV_LOG_PANIC:
case AV_LOG_FATAL:
return SDL_LOG_PRIORITY_CRITICAL;
case AV_LOG_ERROR:
return SDL_LOG_PRIORITY_ERROR;
case AV_LOG_WARNING:
return SDL_LOG_PRIORITY_WARN;
case AV_LOG_INFO:
return SDL_LOG_PRIORITY_INFO;
}
// do not forward others, which are too verbose
return 0;
}
static void
sc_av_log_callback(void *avcl, int level, const char *fmt, va_list vl) {
(void) avcl;
SDL_LogPriority priority = sdl_priority_from_av_level(level);
if (priority == 0) {
return;
}
size_t fmt_len = strlen(fmt);
char *local_fmt = malloc(fmt_len + 10);
if (!local_fmt) {
LOG_OOM();
return;
}
memcpy(local_fmt, "[FFmpeg] ", 9); // do not write the final '\0'
memcpy(local_fmt + 9, fmt, fmt_len + 1); // include '\0'
SDL_LogMessageV(SDL_LOG_CATEGORY_CUSTOM, priority, local_fmt, vl);
free(local_fmt);
}
void
sc_log_configure() {
// Redirect FFmpeg logs to SDL logs
av_log_set_callback(sc_av_log_callback);
}

View File

@@ -35,7 +35,4 @@ bool
sc_log_windows_error(const char *prefix, int error);
#endif
void
sc_log_configure();
#endif

View File

@@ -30,8 +30,8 @@ bool
net_init(void) {
#ifdef _WIN32
WSADATA wsa;
int res = WSAStartup(MAKEWORD(1, 1), &wsa);
if (res) {
int res = WSAStartup(MAKEWORD(2, 2), &wsa) < 0;
if (res < 0) {
LOGE("WSAStartup failed with error %d", res);
return false;
}

View File

@@ -156,10 +156,7 @@ sc_video_buffer_on_new_frame(struct sc_video_buffer *vb, bool previous_skipped,
}
static bool
sc_v4l2_sink_open(struct sc_v4l2_sink *vs, const AVCodecContext *ctx) {
assert(ctx->pix_fmt == AV_PIX_FMT_YUV420P);
(void) ctx;
sc_v4l2_sink_open(struct sc_v4l2_sink *vs) {
static const struct sc_video_buffer_callbacks cbs = {
.on_new_frame = sc_video_buffer_on_new_frame,
};
@@ -339,9 +336,9 @@ sc_v4l2_sink_push(struct sc_v4l2_sink *vs, const AVFrame *frame) {
}
static bool
sc_v4l2_frame_sink_open(struct sc_frame_sink *sink, const AVCodecContext *ctx) {
sc_v4l2_frame_sink_open(struct sc_frame_sink *sink) {
struct sc_v4l2_sink *vs = DOWNCAST(sink);
return sc_v4l2_sink_open(vs, ctx);
return sc_v4l2_sink_open(vs);
}
static void

View File

@@ -1,154 +0,0 @@
#include "common.h"
#include <assert.h>
#include <string.h>
#include "util/bytebuf.h"
void test_bytebuf_simple(void) {
struct sc_bytebuf buf;
uint8_t data[20];
bool ok = sc_bytebuf_init(&buf, 20);
assert(ok);
sc_bytebuf_write(&buf, (uint8_t *) "hello", sizeof("hello") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 5);
sc_bytebuf_read(&buf, data, 4);
assert(!strncmp((char *) data, "hell", 4));
sc_bytebuf_write(&buf, (uint8_t *) " world", sizeof(" world") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 7);
sc_bytebuf_write(&buf, (uint8_t *) "!", 1);
assert(sc_bytebuf_read_remaining(&buf) == 8);
sc_bytebuf_read(&buf, &data[4], 8);
assert(sc_bytebuf_read_remaining(&buf) == 0);
data[12] = '\0';
assert(!strcmp((char *) data, "hello world!"));
assert(sc_bytebuf_read_remaining(&buf) == 0);
sc_bytebuf_destroy(&buf);
}
void test_bytebuf_boundaries(void) {
struct sc_bytebuf buf;
uint8_t data[20];
bool ok = sc_bytebuf_init(&buf, 20);
assert(ok);
sc_bytebuf_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 6);
sc_bytebuf_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 12);
sc_bytebuf_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 18);
sc_bytebuf_read(&buf, data, 9);
assert(!strncmp((char *) data, "hello hel", 9));
assert(sc_bytebuf_read_remaining(&buf) == 9);
sc_bytebuf_write(&buf, (uint8_t *) "world", sizeof("world") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 14);
sc_bytebuf_write(&buf, (uint8_t *) "!", 1);
assert(sc_bytebuf_read_remaining(&buf) == 15);
sc_bytebuf_skip(&buf, 3);
assert(sc_bytebuf_read_remaining(&buf) == 12);
sc_bytebuf_read(&buf, data, 12);
data[12] = '\0';
assert(!strcmp((char *) data, "hello world!"));
assert(sc_bytebuf_read_remaining(&buf) == 0);
sc_bytebuf_destroy(&buf);
}
void test_bytebuf_overwrite(void) {
struct sc_bytebuf buf;
uint8_t data[10];
bool ok = sc_bytebuf_init(&buf, 10); // so actual capacity is 9
assert(ok);
sc_bytebuf_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 6);
sc_bytebuf_write(&buf, (uint8_t *) "abcdef", sizeof("abcdef") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 9);
sc_bytebuf_read(&buf, data, 9);
assert(!strncmp((char *) data, "lo abcdef", 9));
sc_bytebuf_write(&buf, (uint8_t *) "a very big buffer",
sizeof("a very big buffer") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 9);
sc_bytebuf_read(&buf, data, 9);
assert(!strncmp((char *) data, "ig buffer", 9));
assert(sc_bytebuf_read_remaining(&buf) == 0);
sc_bytebuf_destroy(&buf);
}
void test_bytebuf_two_steps_write(void) {
struct sc_bytebuf buf;
uint8_t data[20];
bool ok = sc_bytebuf_init(&buf, 20);
assert(ok);
sc_bytebuf_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 6);
sc_bytebuf_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 12);
sc_bytebuf_prepare_write(&buf, (uint8_t *) "hello ", sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 12); // write not committed yet
sc_bytebuf_read(&buf, data, 9);
assert(!strncmp((char *) data, "hello hel", 3));
assert(sc_bytebuf_read_remaining(&buf) == 3);
sc_bytebuf_commit_write(&buf, sizeof("hello ") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 9);
sc_bytebuf_prepare_write(&buf, (uint8_t *) "world", sizeof("world") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 9); // write not committed yet
sc_bytebuf_commit_write(&buf, sizeof("world") - 1);
assert(sc_bytebuf_read_remaining(&buf) == 14);
sc_bytebuf_write(&buf, (uint8_t *) "!", 1);
assert(sc_bytebuf_read_remaining(&buf) == 15);
sc_bytebuf_skip(&buf, 3);
assert(sc_bytebuf_read_remaining(&buf) == 12);
sc_bytebuf_read(&buf, data, 12);
data[12] = '\0';
assert(!strcmp((char *) data, "hello world!"));
assert(sc_bytebuf_read_remaining(&buf) == 0);
sc_bytebuf_destroy(&buf);
}
int main(int argc, char *argv[]) {
(void) argc;
(void) argv;
test_bytebuf_simple();
test_bytebuf_boundaries();
test_bytebuf_overwrite();
test_bytebuf_two_steps_write();
return 0;
}

View File

@@ -46,7 +46,7 @@ static void test_options(void) {
char *argv[] = {
"scrcpy",
"--always-on-top",
"--video-bit-rate", "5M",
"--bit-rate", "5M",
"--crop", "100:200:300:400",
"--fullscreen",
"--max-fps", "30",
@@ -75,7 +75,7 @@ static void test_options(void) {
const struct scrcpy_options *opts = &args.opts;
assert(opts->always_on_top);
assert(opts->video_bit_rate == 5000000);
assert(opts->bit_rate == 5000000);
assert(!strcmp(opts->crop, "100:200:300:400"));
assert(opts->fullscreen);
assert(opts->max_fps == 30);

View File

@@ -7,7 +7,7 @@ buildscript {
mavenCentral()
}
dependencies {
classpath 'com.android.tools.build:gradle:7.4.0'
classpath 'com.android.tools.build:gradle:7.2.2'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files

View File

@@ -19,7 +19,6 @@ endian = 'little'
ffmpeg_avcodec = 'avcodec-58'
ffmpeg_avformat = 'avformat-58'
ffmpeg_avutil = 'avutil-56'
ffmpeg_swresample = 'swresample-3'
prebuilt_ffmpeg = 'ffmpeg-win32-4.3.1'
prebuilt_sdl2 = 'SDL2-2.26.1/i686-w64-mingw32'
prebuilt_libusb_root = 'libusb-1.0.26'

View File

@@ -19,7 +19,6 @@ endian = 'little'
ffmpeg_avcodec = 'avcodec-59'
ffmpeg_avformat = 'avformat-59'
ffmpeg_avutil = 'avutil-57'
ffmpeg_swresample = 'swresample-4'
prebuilt_ffmpeg = 'ffmpeg-win64-5.1.2'
prebuilt_sdl2 = 'SDL2-2.26.1/x86_64-w64-mingw32'
prebuilt_libusb_root = 'libusb-1.0.26'

View File

@@ -1,5 +1,5 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-7.5-all.zip
distributionUrl=https\://services.gradle.org/distributions/gradle-7.3.3-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists

View File

@@ -1,7 +1,6 @@
apply plugin: 'com.android.application'
android {
namespace 'com.genymobile.scrcpy'
compileSdkVersion 33
defaultConfig {
applicationId "com.genymobile.scrcpy"

View File

@@ -1,2 +1,2 @@
<!-- not a real Android application, it is run by app_process manually -->
<manifest />
<manifest package="com.genymobile.scrcpy"/>

View File

@@ -1,11 +1,7 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.content.ComponentName;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioTimestamp;
@@ -15,19 +11,16 @@ import android.media.MediaRecorder;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Looper;
import android.os.SystemClock;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
public final class AudioEncoder {
private static class InputTask {
private final int index;
final int index;
InputTask(int index) {
this.index = index;
@@ -35,8 +28,8 @@ public final class AudioEncoder {
}
private static class OutputTask {
private final int index;
private final MediaCodec.BufferInfo bufferInfo;
final int index;
final MediaCodec.BufferInfo bufferInfo;
OutputTask(int index, MediaCodec.BufferInfo bufferInfo) {
this.index = index;
@@ -45,18 +38,16 @@ public final class AudioEncoder {
}
private static final int SAMPLE_RATE = 48000;
private static final int CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_STEREO;
private static final int CHANNELS = 2;
private static final int FORMAT = AudioFormat.ENCODING_PCM_16BIT;
private static final int BYTES_PER_SAMPLE = 2;
private static final int BUFFER_MS = 5; // milliseconds
private static final int BUFFER_SIZE = SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * BUFFER_MS / 1000;
private static final int BUFFER_MS = 15; // milliseconds
private static final int BUFFER_SIZE = SAMPLE_RATE * CHANNELS * BUFFER_MS / 1000;
private final Streamer streamer;
private final int bitRate;
private final List<CodecOption> codecOptions;
private final String encoderName;
private AudioRecord recorder;
private MediaCodec mediaCodec;
// Capacity of 64 is in practice "infinite" (it is limited by the number of available MediaCodec buffers, typically 4).
// So many pending tasks would lead to an unacceptable delay anyway.
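As a quick check of the two buffer-size computations visible in this hunk (a standalone hypothetical computation, not scrcpy code): at 48 kHz, 16-bit stereo PCM, the variant that includes BYTES_PER_SAMPLE yields a size in bytes, while the variant without it only counts samples, so the two constants are not directly comparable.

#include <stdio.h>

int main(void) {
    const int sample_rate = 48000;
    const int channels = 2;
    const int bytes_per_sample = 2; // AudioFormat.ENCODING_PCM_16BIT
    // SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * BUFFER_MS / 1000, with BUFFER_MS = 5
    int bytes_5ms = sample_rate * channels * bytes_per_sample * 5 / 1000;  // 960 bytes
    // SAMPLE_RATE * CHANNELS * BUFFER_MS / 1000, with BUFFER_MS = 15
    int samples_15ms = sample_rate * channels * 15 / 1000;                 // 1440 samples
    printf("%d bytes vs %d samples (%d bytes)\n",
           bytes_5ms, samples_15ms, samples_15ms * bytes_per_sample);
    return 0;
}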
@@ -71,18 +62,16 @@ public final class AudioEncoder {
private boolean ended;
public AudioEncoder(Streamer streamer, int bitRate, List<CodecOption> codecOptions, String encoderName) {
public AudioEncoder(Streamer streamer, int bitRate) {
this.streamer = streamer;
this.bitRate = bitRate;
this.codecOptions = codecOptions;
this.encoderName = encoderName;
}
private static AudioFormat createAudioFormat() {
AudioFormat.Builder builder = new AudioFormat.Builder();
builder.setEncoding(FORMAT);
builder.setEncoding(AudioFormat.ENCODING_PCM_16BIT);
builder.setSampleRate(SAMPLE_RATE);
builder.setChannelMask(CHANNEL_CONFIG);
builder.setChannelMask(CHANNELS == 2 ? AudioFormat.CHANNEL_IN_STEREO : AudioFormat.CHANNEL_IN_MONO);
return builder.build();
}
@@ -96,32 +85,21 @@ public final class AudioEncoder {
}
builder.setAudioSource(MediaRecorder.AudioSource.REMOTE_SUBMIX);
builder.setAudioFormat(createAudioFormat());
int minBufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_CONFIG, FORMAT);
builder.setBufferSizeInBytes(minBufferSize);
builder.setBufferSizeInBytes(1024 * 1024);
return builder.build();
}
private static MediaFormat createFormat(String mimeType, int bitRate, List<CodecOption> codecOptions) {
private static MediaFormat createFormat(String mimeType, int bitRate) {
MediaFormat format = new MediaFormat();
format.setString(MediaFormat.KEY_MIME, mimeType);
format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, CHANNELS);
format.setInteger(MediaFormat.KEY_SAMPLE_RATE, SAMPLE_RATE);
if (codecOptions != null) {
for (CodecOption option : codecOptions) {
String key = option.getKey();
Object value = option.getValue();
CodecUtils.setCodecOption(format, key, value);
Ln.d("Audio codec option set: " + key + " (" + value.getClass().getSimpleName() + ") = " + value);
}
}
return format;
}
@TargetApi(Build.VERSION_CODES.N)
private void inputThread(MediaCodec mediaCodec, AudioRecord recorder) throws IOException, InterruptedException {
private void inputThread() throws IOException, InterruptedException {
final AudioTimestamp timestamp = new AudioTimestamp();
long previousPts = 0;
long nextPts = 0;
@@ -165,7 +143,7 @@ public final class AudioEncoder {
}
}
private void outputThread(MediaCodec mediaCodec) throws IOException, InterruptedException {
private void outputThread() throws IOException, InterruptedException {
streamer.writeHeader();
while (!Thread.currentThread().isInterrupted()) {
@@ -220,77 +198,32 @@ public final class AudioEncoder {
}
}
private static void startWorkaroundAndroid11() {
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
// Android 11 requires apps to be in the foreground to record audio.
// Normally, each app has its own user ID, so Android checks whether the requesting app's user ID is that of the current foreground app.
// But the scrcpy server is NOT an app: it is a Java application started from the Android shell, so it has the same user ID (2000) as the
// Android shell ("com.android.shell").
// If an Activity from the Android shell is running in the foreground, the permission system will consider scrcpy to be in the
// foreground as well.
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
Intent intent = new Intent(Intent.ACTION_MAIN);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intent.addCategory(Intent.CATEGORY_LAUNCHER);
intent.setComponent(new ComponentName(FakeContext.PACKAGE_NAME, "com.android.shell.HeapDumpActivity"));
ServiceManager.getActivityManager().startActivityAsUserWithFeature(intent);
// Wait for activity to start
SystemClock.sleep(150);
}
}
}
private static void stopWorkaroundAndroid11() {
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
ServiceManager.getActivityManager().forceStopPackage(FakeContext.PACKAGE_NAME);
}
}
@TargetApi(Build.VERSION_CODES.M)
public void encode() throws IOException {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.R) {
Ln.w("Audio disabled: it is not supported before Android 11");
streamer.writeDisableStream(false);
return;
}
MediaCodec mediaCodec = null;
AudioRecord recorder = null;
boolean mediaCodecStarted = false;
boolean recorderStarted = false;
boolean configurationError = false;
try {
Codec codec = streamer.getCodec();
mediaCodec = createMediaCodec(codec, encoderName);
mediaCodecThread = new HandlerThread("AudioEncoder");
mediaCodecThread.start();
MediaFormat format = createFormat(codec.getMimeType(), bitRate, codecOptions);
mediaCodec.setCallback(new EncoderCallback(), new Handler(mediaCodecThread.getLooper()));
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
startWorkaroundAndroid11();
try {
recorder = createAudioRecord();
recorder.startRecording();
} catch (UnsupportedOperationException e) {
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.R) {
Ln.e("Failed to start audio capture");
Ln.e("On Android 11, it is only possible to capture in foreground, make sure that the device is unlocked when starting scrcpy.");
throw new ConfigurationException("Unsupported audio capture");
}
} finally {
stopWorkaroundAndroid11();
}
recorderStarted = true;
String mimeType = streamer.getCodec().getMimeType();
mediaCodec = MediaCodec.createEncoderByType(mimeType); // may throw IOException
recorder = createAudioRecord();
mediaCodecThread = new HandlerThread("AudioEncoder");
mediaCodecThread.start();
MediaFormat format = createFormat(mimeType, bitRate);
mediaCodec.setCallback(new EncoderCallback(), new Handler(mediaCodecThread.getLooper()));
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
recorder.startRecording();
} catch (Throwable e) {
// Notify the client that the audio could not be captured
streamer.writeDisableStream();
throw e;
}
final MediaCodec mediaCodecRef = mediaCodec;
final AudioRecord recorderRef = recorder;
inputThread = new Thread(() -> {
try {
inputThread(mediaCodecRef, recorderRef);
inputThread();
} catch (IOException | InterruptedException e) {
Ln.e("Audio capture error", e);
} finally {
@@ -300,7 +233,7 @@ public final class AudioEncoder {
outputThread = new Thread(() -> {
try {
outputThread(mediaCodecRef);
outputThread();
} catch (InterruptedException e) {
// this is expected on close
} catch (IOException e) {
@@ -314,78 +247,43 @@ public final class AudioEncoder {
});
mediaCodec.start();
mediaCodecStarted = true;
inputThread.start();
outputThread.start();
waitEnded();
} catch (ConfigurationException e) {
// Do not print a stack trace: a user-friendly error message has already been logged
// Notify the error to scrcpy to make it exit
configurationError = true;
} finally {
if (!recorderStarted) {
// Notify the client that the audio could not be captured
streamer.writeDisableStream(configurationError);
}
// Clean up everything (either at the end or on error at any step of the initialization)
if (mediaCodecThread != null) {
Looper looper = mediaCodecThread.getLooper();
if (looper != null) {
looper.quitSafely();
}
}
if (inputThread != null) {
inputThread.interrupt();
}
if (outputThread != null) {
outputThread.interrupt();
}
try {
if (mediaCodecThread != null) {
mediaCodecThread.join();
}
if (inputThread != null) {
inputThread.join();
}
if (outputThread != null) {
outputThread.join();
}
} catch (InterruptedException e) {
// Should never happen
throw new AssertionError(e);
}
} catch (Throwable e) {
if (mediaCodec != null) {
if (mediaCodecStarted) {
mediaCodec.stop();
}
mediaCodec.release();
}
if (recorder != null) {
if (recorderStarted) {
recorder.stop();
}
recorder.release();
}
throw e;
}
try {
waitEnded();
} finally {
cleanUp();
}
}
private static MediaCodec createMediaCodec(Codec codec, String encoderName) throws IOException, ConfigurationException {
if (encoderName != null) {
Ln.d("Creating audio encoder by name: '" + encoderName + "'");
try {
return MediaCodec.createByCodecName(encoderName);
} catch (IllegalArgumentException e) {
Ln.e("Encoder '" + encoderName + "' for " + codec.getName() + " not found\n" + LogUtils.buildAudioEncoderListMessage());
throw new ConfigurationException("Unknown encoder: " + encoderName);
}
private void cleanUp() {
mediaCodecThread.getLooper().quit();
inputThread.interrupt();
outputThread.interrupt();
try {
mediaCodecThread.join();
inputThread.join();
outputThread.join();
} catch (InterruptedException e) {
// Should never happen
throw new AssertionError(e);
}
MediaCodec mediaCodec = MediaCodec.createEncoderByType(codec.getMimeType());
Ln.d("Using audio encoder: '" + mediaCodec.getName() + "'");
return mediaCodec;
mediaCodec.stop();
mediaCodec.release();
recorder.stop();
recorder.release();
}
private class EncoderCallback extends MediaCodec.Callback {

View File

@@ -139,7 +139,7 @@ public final class CleanUp {
builder.start();
}
public static void unlinkSelf() {
private static void unlinkSelf() {
try {
new File(SERVER_PATH).delete();
} catch (Exception e) {

View File

@@ -2,10 +2,7 @@ package com.genymobile.scrcpy;
public interface Codec {
enum Type {
VIDEO,
AUDIO,
}
enum Type {VIDEO, AUDIO}
Type getType();

View File

@@ -1,78 +0,0 @@
package com.genymobile.scrcpy;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public final class CodecUtils {
public static final class DeviceEncoder {
private final Codec codec;
private final MediaCodecInfo info;
DeviceEncoder(Codec codec, MediaCodecInfo info) {
this.codec = codec;
this.info = info;
}
public Codec getCodec() {
return codec;
}
public MediaCodecInfo getInfo() {
return info;
}
}
private CodecUtils() {
// not instantiable
}
public static void setCodecOption(MediaFormat format, String key, Object value) {
if (value instanceof Integer) {
format.setInteger(key, (Integer) value);
} else if (value instanceof Long) {
format.setLong(key, (Long) value);
} else if (value instanceof Float) {
format.setFloat(key, (Float) value);
} else if (value instanceof String) {
format.setString(key, (String) value);
}
}
private static MediaCodecInfo[] getEncoders(MediaCodecList codecs, String mimeType) {
List<MediaCodecInfo> result = new ArrayList<>();
for (MediaCodecInfo codecInfo : codecs.getCodecInfos()) {
if (codecInfo.isEncoder() && Arrays.asList(codecInfo.getSupportedTypes()).contains(mimeType)) {
result.add(codecInfo);
}
}
return result.toArray(new MediaCodecInfo[result.size()]);
}
public static List<DeviceEncoder> listVideoEncoders() {
List<DeviceEncoder> encoders = new ArrayList<>();
MediaCodecList codecs = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (VideoCodec codec : VideoCodec.values()) {
for (MediaCodecInfo info : getEncoders(codecs, codec.getMimeType())) {
encoders.add(new DeviceEncoder(codec, info));
}
}
return encoders;
}
public static List<DeviceEncoder> listAudioEncoders() {
List<DeviceEncoder> encoders = new ArrayList<>();
MediaCodecList codecs = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (AudioCodec codec : AudioCodec.values()) {
for (MediaCodecInfo info : getEncoders(codecs, codec.getMimeType())) {
encoders.add(new DeviceEncoder(codec, info));
}
}
return encoders;
}
}

View File

@@ -1,7 +0,0 @@
package com.genymobile.scrcpy;
public class ConfigurationException extends Exception {
public ConfigurationException(String message) {
super(message);
}
}

View File

@@ -61,12 +61,12 @@ public final class Device {
private final boolean supportsInputEvents;
public Device(Options options) throws ConfigurationException {
public Device(Options options) {
displayId = options.getDisplayId();
DisplayInfo displayInfo = ServiceManager.getDisplayManager().getDisplayInfo(displayId);
if (displayInfo == null) {
Ln.e("Display " + displayId + " not found\n" + LogUtils.buildDisplayListMessage());
throw new ConfigurationException("Unknown display id: " + displayId);
int[] displayIds = ServiceManager.getDisplayManager().getDisplayIds();
throw new InvalidDisplayIdException(displayId, displayIds);
}
int displayInfoFlags = displayInfo.getFlags();

View File

@@ -9,7 +9,6 @@ import android.os.Process;
public final class FakeContext extends ContextWrapper {
public static final String PACKAGE_NAME = "com.android.shell";
public static final int ROOT_UID = 0; // Like android.os.Process.ROOT_UID, but before API 29
private static final FakeContext INSTANCE = new FakeContext();

View File

@@ -0,0 +1,21 @@
package com.genymobile.scrcpy;
public class InvalidDisplayIdException extends RuntimeException {
private final int displayId;
private final int[] availableDisplayIds;
public InvalidDisplayIdException(int displayId, int[] availableDisplayIds) {
super("There is no display having id " + displayId);
this.displayId = displayId;
this.availableDisplayIds = availableDisplayIds;
}
public int getDisplayId() {
return displayId;
}
public int[] getAvailableDisplayIds() {
return availableDisplayIds;
}
}

View File

@@ -0,0 +1,23 @@
package com.genymobile.scrcpy;
import android.media.MediaCodecInfo;
public class InvalidEncoderException extends RuntimeException {
private final String name;
private final MediaCodecInfo[] availableEncoders;
public InvalidEncoderException(String name, MediaCodecInfo[] availableEncoders) {
super("There is no encoder having name '" + name + '"');
this.name = name;
this.availableEncoders = availableEncoders;
}
public String getName() {
return name;
}
public MediaCodecInfo[] getAvailableEncoders() {
return availableEncoders;
}
}

View File

@@ -1,63 +0,0 @@
package com.genymobile.scrcpy;
import com.genymobile.scrcpy.wrappers.DisplayManager;
import com.genymobile.scrcpy.wrappers.ServiceManager;
import java.util.List;
public final class LogUtils {
private LogUtils() {
// not instantiable
}
public static String buildVideoEncoderListMessage() {
StringBuilder builder = new StringBuilder("List of video encoders:");
List<CodecUtils.DeviceEncoder> videoEncoders = CodecUtils.listVideoEncoders();
if (videoEncoders.isEmpty()) {
builder.append("\n (none)");
} else {
for (CodecUtils.DeviceEncoder encoder : videoEncoders) {
builder.append("\n --video-codec=").append(encoder.getCodec().getName());
builder.append(" --video-encoder='").append(encoder.getInfo().getName()).append("'");
}
}
return builder.toString();
}
public static String buildAudioEncoderListMessage() {
StringBuilder builder = new StringBuilder("List of audio encoders:");
List<CodecUtils.DeviceEncoder> audioEncoders = CodecUtils.listAudioEncoders();
if (audioEncoders.isEmpty()) {
builder.append("\n (none)");
} else {
for (CodecUtils.DeviceEncoder encoder : audioEncoders) {
builder.append("\n --audio-codec=").append(encoder.getCodec().getName());
builder.append(" --audio-encoder='").append(encoder.getInfo().getName()).append("'");
}
}
return builder.toString();
}
public static String buildDisplayListMessage() {
StringBuilder builder = new StringBuilder("List of displays:");
DisplayManager displayManager = ServiceManager.getDisplayManager();
int[] displayIds = displayManager.getDisplayIds();
if (displayIds == null || displayIds.length == 0) {
builder.append("\n (none)");
} else {
for (int id : displayIds) {
builder.append("\n --display=").append(id).append(" (");
DisplayInfo displayInfo = displayManager.getDisplayInfo(id);
if (displayInfo != null) {
Size size = displayInfo.getSize();
builder.append(size.getWidth()).append("x").append(size.getHeight());
} else {
builder.append("size unknown");
}
builder.append(")");
}
}
return builder.toString();
}
}

View File

@@ -10,9 +10,9 @@ public class Options {
private int scid = -1; // 31-bit non-negative value, or -1
private boolean audio = true;
private int maxSize;
private VideoCodec videoCodec = VideoCodec.H264;
private VideoCodec codec = VideoCodec.H264;
private AudioCodec audioCodec = AudioCodec.OPUS;
private int videoBitRate = 8000000;
private int bitRate = 8000000;
private int audioBitRate = 196000;
private int maxFps;
private int lockVideoOrientation = -1;
@@ -22,20 +22,14 @@ public class Options {
private int displayId;
private boolean showTouches;
private boolean stayAwake;
private List<CodecOption> videoCodecOptions;
private List<CodecOption> audioCodecOptions;
private String videoEncoder;
private String audioEncoder;
private List<CodecOption> codecOptions;
private String encoderName;
private boolean powerOffScreenOnClose;
private boolean clipboardAutosync = true;
private boolean downsizeOnError = true;
private boolean cleanup = true;
private boolean powerOn = true;
private boolean listEncoders;
private boolean listDisplays;
// Options not used by the scrcpy client, but useful to use scrcpy-server directly
private boolean sendDeviceMeta = true; // send device name and size
private boolean sendFrameMeta = true; // send PTS so that the client may record properly
@@ -74,12 +68,12 @@ public class Options {
this.maxSize = maxSize;
}
public VideoCodec getVideoCodec() {
return videoCodec;
public VideoCodec getCodec() {
return codec;
}
public void setVideoCodec(VideoCodec videoCodec) {
this.videoCodec = videoCodec;
public void setCodec(VideoCodec codec) {
this.codec = codec;
}
public AudioCodec getAudioCodec() {
@@ -90,12 +84,12 @@ public class Options {
this.audioCodec = audioCodec;
}
public int getVideoBitRate() {
return videoBitRate;
public int getBitRate() {
return bitRate;
}
public void setVideoBitRate(int videoBitRate) {
this.videoBitRate = videoBitRate;
public void setBitRate(int bitRate) {
this.bitRate = bitRate;
}
public int getAudioBitRate() {
@@ -170,36 +164,20 @@ public class Options {
this.stayAwake = stayAwake;
}
public List<CodecOption> getVideoCodecOptions() {
return videoCodecOptions;
public List<CodecOption> getCodecOptions() {
return codecOptions;
}
public void setVideoCodecOptions(List<CodecOption> videoCodecOptions) {
this.videoCodecOptions = videoCodecOptions;
public void setCodecOptions(List<CodecOption> codecOptions) {
this.codecOptions = codecOptions;
}
public List<CodecOption> getAudioCodecOptions() {
return audioCodecOptions;
public String getEncoderName() {
return encoderName;
}
public void setAudioCodecOptions(List<CodecOption> audioCodecOptions) {
this.audioCodecOptions = audioCodecOptions;
}
public String getVideoEncoder() {
return videoEncoder;
}
public void setVideoEncoder(String videoEncoder) {
this.videoEncoder = videoEncoder;
}
public String getAudioEncoder() {
return audioEncoder;
}
public void setAudioEncoder(String audioEncoder) {
this.audioEncoder = audioEncoder;
public void setEncoderName(String encoderName) {
this.encoderName = encoderName;
}
public void setPowerOffScreenOnClose(boolean powerOffScreenOnClose) {
@@ -242,22 +220,6 @@ public class Options {
this.powerOn = powerOn;
}
public boolean getListEncoders() {
return listEncoders;
}
public void setListEncoders(boolean listEncoders) {
this.listEncoders = listEncoders;
}
public boolean getListDisplays() {
return listDisplays;
}
public void setListDisplays(boolean listDisplays) {
this.listDisplays = listDisplays;
}
public boolean getSendDeviceMeta() {
return sendDeviceMeta;
}

View File

@@ -5,6 +5,7 @@ import com.genymobile.scrcpy.wrappers.SurfaceControl;
import android.graphics.Rect;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;
import android.os.Build;
import android.os.IBinder;
@@ -13,6 +14,8 @@ import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;
@@ -32,18 +35,18 @@ public class ScreenEncoder implements Device.RotationListener {
private final Streamer streamer;
private final String encoderName;
private final List<CodecOption> codecOptions;
private final int videoBitRate;
private final int bitRate;
private final int maxFps;
private final boolean downsizeOnError;
private boolean firstFrameSent;
private int consecutiveErrors;
public ScreenEncoder(Device device, Streamer streamer, int videoBitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
public ScreenEncoder(Device device, Streamer streamer, int bitRate, int maxFps, List<CodecOption> codecOptions, String encoderName,
boolean downsizeOnError) {
this.device = device;
this.streamer = streamer;
this.videoBitRate = videoBitRate;
this.bitRate = bitRate;
this.maxFps = maxFps;
this.codecOptions = codecOptions;
this.encoderName = encoderName;
@@ -59,10 +62,10 @@ public class ScreenEncoder implements Device.RotationListener {
return rotationChanged.getAndSet(false);
}
public void streamScreen() throws IOException, ConfigurationException {
Codec codec = streamer.getCodec();
MediaCodec mediaCodec = createMediaCodec(codec, encoderName);
MediaFormat format = createFormat(codec.getMimeType(), videoBitRate, maxFps, codecOptions);
public void streamScreen() throws IOException {
String videoMimeType = streamer.getCodec().getMimeType();
MediaCodec codec = createCodec(videoMimeType, encoderName);
MediaFormat format = createFormat(videoMimeType, bitRate, maxFps, codecOptions);
IBinder display = createDisplay();
device.setRotationListener(this);
@@ -81,8 +84,8 @@ public class ScreenEncoder implements Device.RotationListener {
Surface surface = null;
try {
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
surface = mediaCodec.createInputSurface();
codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
surface = codec.createInputSurface();
// does not include the locked video orientation
Rect unlockedVideoRect = screenInfo.getUnlockedVideoSize().toRect();
@@ -90,11 +93,11 @@ public class ScreenEncoder implements Device.RotationListener {
int layerStack = device.getLayerStack();
setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);
mediaCodec.start();
codec.start();
alive = encode(mediaCodec, streamer);
alive = encode(codec, streamer);
// do not call stop() on exception, it would trigger an IllegalStateException
mediaCodec.stop();
codec.stop();
} catch (IllegalStateException | IllegalArgumentException e) {
Ln.e("Encoding error: " + e.getClass().getName() + ": " + e.getMessage());
if (!prepareRetry(device, screenInfo)) {
@@ -103,14 +106,14 @@ public class ScreenEncoder implements Device.RotationListener {
Ln.i("Retrying...");
alive = true;
} finally {
mediaCodec.reset();
codec.reset();
if (surface != null) {
surface.release();
}
}
} while (alive);
} finally {
mediaCodec.release();
codec.release();
device.setRotationListener(null);
SurfaceControl.destroyDisplay(display);
}
@@ -196,19 +199,47 @@ public class ScreenEncoder implements Device.RotationListener {
return !eof;
}
private static MediaCodec createMediaCodec(Codec codec, String encoderName) throws IOException, ConfigurationException {
private static MediaCodecInfo[] listEncoders(String videoMimeType) {
List<MediaCodecInfo> result = new ArrayList<>();
MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (MediaCodecInfo codecInfo : list.getCodecInfos()) {
if (codecInfo.isEncoder() && Arrays.asList(codecInfo.getSupportedTypes()).contains(videoMimeType)) {
result.add(codecInfo);
}
}
return result.toArray(new MediaCodecInfo[result.size()]);
}
private static MediaCodec createCodec(String videoMimeType, String encoderName) throws IOException {
if (encoderName != null) {
Ln.d("Creating encoder by name: '" + encoderName + "'");
try {
return MediaCodec.createByCodecName(encoderName);
} catch (IllegalArgumentException e) {
Ln.e("Encoder '" + encoderName + "' for " + codec.getName() + " not found\n" + LogUtils.buildVideoEncoderListMessage());
throw new ConfigurationException("Unknown encoder: " + encoderName);
MediaCodecInfo[] encoders = listEncoders(videoMimeType);
throw new InvalidEncoderException(encoderName, encoders);
}
}
MediaCodec mediaCodec = MediaCodec.createEncoderByType(codec.getMimeType());
Ln.d("Using encoder: '" + mediaCodec.getName() + "'");
return mediaCodec;
MediaCodec codec = MediaCodec.createEncoderByType(videoMimeType);
Ln.d("Using encoder: '" + codec.getName() + "'");
return codec;
}
private static void setCodecOption(MediaFormat format, CodecOption codecOption) {
String key = codecOption.getKey();
Object value = codecOption.getValue();
if (value instanceof Integer) {
format.setInteger(key, (Integer) value);
} else if (value instanceof Long) {
format.setLong(key, (Long) value);
} else if (value instanceof Float) {
format.setFloat(key, (Float) value);
} else if (value instanceof String) {
format.setString(key, (String) value);
}
Ln.d("Codec option set: " + key + " (" + value.getClass().getSimpleName() + ") = " + value);
}
private static MediaFormat createFormat(String videoMimeType, int bitRate, int maxFps, List<CodecOption> codecOptions) {
@@ -230,10 +261,7 @@ public class ScreenEncoder implements Device.RotationListener {
if (codecOptions != null) {
for (CodecOption option : codecOptions) {
String key = option.getKey();
Object value = option.getValue();
CodecUtils.setCodecOption(format, key, value);
Ln.d("Video codec option set: " + key + " (" + value.getClass().getSimpleName() + ") = " + value);
setCodecOption(format, option);
}
}

View File

@@ -1,6 +1,7 @@
package com.genymobile.scrcpy;
import android.graphics.Rect;
import android.media.MediaCodecInfo;
import android.os.BatteryManager;
import android.os.Build;
@@ -58,9 +59,10 @@ public final class Server {
}
}
private static void scrcpy(Options options) throws IOException, ConfigurationException {
private static void scrcpy(Options options) throws IOException {
Ln.i("Device: " + Build.MANUFACTURER + " " + Build.MODEL + " (Android " + Build.VERSION.RELEASE + ")");
final Device device = new Device(options);
List<CodecOption> codecOptions = options.getCodecOptions();
Thread initThread = startInitThread(options);
@@ -91,15 +93,14 @@ public final class Server {
Workarounds.fillAppInfo();
}
Controller controller = null;
AudioEncoder audioEncoder = null;
try (DesktopConnection connection = DesktopConnection.open(scid, tunnelForward, audio, control, sendDummyByte)) {
VideoCodec codec = options.getCodec();
if (options.getSendDeviceMeta()) {
Size videoSize = device.getScreenInfo().getVideoSize();
connection.sendDeviceMeta(Device.getDeviceName(), videoSize.getWidth(), videoSize.getHeight());
}
Controller controller = null;
if (control) {
controller = new Controller(device, connection, options.getClipboardAutosync(), options.getPowerOn());
controller.start();
@@ -108,17 +109,17 @@ public final class Server {
device.setClipboardListener(text -> controllerRef.getSender().pushClipboardText(text));
}
AudioEncoder audioEncoder = null;
if (audio) {
Streamer audioStreamer = new Streamer(connection.getAudioFd(), options.getAudioCodec(), options.getSendCodecId(),
options.getSendFrameMeta());
audioEncoder = new AudioEncoder(audioStreamer, options.getAudioBitRate(), options.getAudioCodecOptions(), options.getAudioEncoder());
audioEncoder = new AudioEncoder(audioStreamer, options.getAudioBitRate());
audioEncoder.start();
}
Streamer videoStreamer = new Streamer(connection.getVideoFd(), options.getVideoCodec(), options.getSendCodecId(),
options.getSendFrameMeta());
ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getVideoBitRate(), options.getMaxFps(),
options.getVideoCodecOptions(), options.getVideoEncoder(), options.getDownsizeOnError());
Streamer videoStreamer = new Streamer(connection.getVideoFd(), codec, options.getSendCodecId(), options.getSendFrameMeta());
ScreenEncoder screenEncoder = new ScreenEncoder(device, videoStreamer, options.getBitRate(), options.getMaxFps(),
codecOptions, options.getEncoderName(), options.getDownsizeOnError());
try {
// synchronous
screenEncoder.streamScreen();
@@ -127,27 +128,27 @@ public final class Server {
if (!IO.isBrokenPipe(e)) {
Ln.e("Video encoding error", e);
}
}
} finally {
Ln.d("Screen streaming stopped");
initThread.interrupt();
if (audioEncoder != null) {
audioEncoder.stop();
}
if (controller != null) {
controller.stop();
}
try {
initThread.join();
} finally {
Ln.d("Screen streaming stopped");
initThread.interrupt();
if (audioEncoder != null) {
audioEncoder.join();
audioEncoder.stop();
}
if (controller != null) {
controller.join();
controller.stop();
}
try {
initThread.join();
if (audioEncoder != null) {
audioEncoder.join();
}
if (controller != null) {
controller.join();
}
} catch (InterruptedException e) {
// ignore
}
} catch (InterruptedException e) {
// ignore
}
}
}
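One of the two teardown variants interleaved above stops every component before joining any of them, so a slow thread cannot delay the stop requests sent to the others. A condensed sketch of that pattern, with simplified names and a hypothetical Stoppable contract rather than verbatim scrcpy code:

import java.util.List;

// Sketch only: stop all workers first, then join them, so that a slow
// worker cannot delay the stop request sent to the others.
final class ShutdownSketch {

    // Minimal contract assumed for this sketch; AudioEncoder and Controller
    // in the diff above expose equivalent stop()/join() methods.
    interface Stoppable {
        void stop();
        void join() throws InterruptedException;
    }

    static void shutdown(Thread initThread, List<Stoppable> workers) {
        initThread.interrupt();
        for (Stoppable worker : workers) {
            worker.stop();            // ask every worker to terminate
        }
        try {
            initThread.join();
            for (Stoppable worker : workers) {
                worker.join();        // then wait for each of them
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // keep the interrupt status
        }
    }

    private ShutdownSketch() {
    }
}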
@@ -158,7 +159,6 @@ public final class Server {
return thread;
}
@SuppressWarnings("MethodLength")
private static Options createOptions(String... args) {
if (args.length < 1) {
throw new IllegalArgumentException("Missing client version");
@@ -196,12 +196,12 @@ public final class Server {
boolean audio = Boolean.parseBoolean(value);
options.setAudio(audio);
break;
case "video_codec":
VideoCodec videoCodec = VideoCodec.findByName(value);
if (videoCodec == null) {
case "codec":
VideoCodec codec = VideoCodec.findByName(value);
if (codec == null) {
throw new IllegalArgumentException("Video codec " + value + " not supported");
}
options.setVideoCodec(videoCodec);
options.setCodec(codec);
break;
case "audio_codec":
AudioCodec audioCodec = AudioCodec.findByName(value);
@@ -214,9 +214,9 @@ public final class Server {
int maxSize = Integer.parseInt(value) & ~7; // multiple of 8
options.setMaxSize(maxSize);
break;
case "video_bit_rate":
int videoBitRate = Integer.parseInt(value);
options.setVideoBitRate(videoBitRate);
case "bit_rate":
int bitRate = Integer.parseInt(value);
options.setBitRate(bitRate);
break;
case "audio_bit_rate":
int audioBitRate = Integer.parseInt(value);
@@ -254,23 +254,15 @@ public final class Server {
boolean stayAwake = Boolean.parseBoolean(value);
options.setStayAwake(stayAwake);
break;
case "video_codec_options":
List<CodecOption> videoCodecOptions = CodecOption.parse(value);
options.setVideoCodecOptions(videoCodecOptions);
case "codec_options":
List<CodecOption> codecOptions = CodecOption.parse(value);
options.setCodecOptions(codecOptions);
break;
case "audio_codec_options":
List<CodecOption> audioCodecOptions = CodecOption.parse(value);
options.setAudioCodecOptions(audioCodecOptions);
break;
case "video_encoder":
case "encoder_name":
if (!value.isEmpty()) {
options.setVideoEncoder(value);
options.setEncoderName(value);
}
break;
case "audio_encoder":
if (!value.isEmpty()) {
options.setAudioEncoder(value);
}
case "power_off_on_close":
boolean powerOffScreenOnClose = Boolean.parseBoolean(value);
options.setPowerOffScreenOnClose(powerOffScreenOnClose);
@@ -291,14 +283,6 @@ public final class Server {
boolean powerOn = Boolean.parseBoolean(value);
options.setPowerOn(powerOn);
break;
case "list_encoders":
boolean listEncoders = Boolean.parseBoolean(value);
options.setListEncoders(listEncoders);
break;
case "list_displays":
boolean listDisplays = Boolean.parseBoolean(value);
options.setListDisplays(listDisplays);
break;
case "send_device_meta":
boolean sendDeviceMeta = Boolean.parseBoolean(value);
options.setSendDeviceMeta(sendDeviceMeta);
@@ -349,35 +333,38 @@ public final class Server {
return new Rect(x, y, x + width, y + height);
}
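The Rect above is built from a crop specification of the form width:height:x:y (the format of scrcpy's --crop option). A self-contained sketch of the parsing step that precedes it, not the verbatim createOptions code:

import android.graphics.Rect;

// Sketch only: parse a crop specification "width:height:x:y" into the Rect
// built above.
final class CropParser {
    static Rect parseCrop(String crop) {
        String[] tokens = crop.split(":");
        if (tokens.length != 4) {
            throw new IllegalArgumentException("Crop must be \"width:height:x:y\": " + crop);
        }
        int width = Integer.parseInt(tokens[0]);
        int height = Integer.parseInt(tokens[1]);
        int x = Integer.parseInt(tokens[2]);
        int y = Integer.parseInt(tokens[3]);
        // Rect takes (left, top, right, bottom)
        return new Rect(x, y, x + width, y + height);
    }

    private CropParser() {
    }
}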
private static void suggestFix(Throwable e) {
if (e instanceof InvalidDisplayIdException) {
InvalidDisplayIdException idie = (InvalidDisplayIdException) e;
int[] displayIds = idie.getAvailableDisplayIds();
if (displayIds != null && displayIds.length > 0) {
Ln.e("Try to use one of the available display ids:");
for (int id : displayIds) {
Ln.e(" scrcpy --display " + id);
}
}
} else if (e instanceof InvalidEncoderException) {
InvalidEncoderException iee = (InvalidEncoderException) e;
MediaCodecInfo[] encoders = iee.getAvailableEncoders();
if (encoders != null && encoders.length > 0) {
Ln.e("Try to use one of the available encoders:");
for (MediaCodecInfo encoder : encoders) {
Ln.e(" scrcpy --encoder '" + encoder.getName() + "'");
}
}
}
}
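suggestFix() relies on exceptions that carry the data needed to print an actionable hint from the top-level handler. A minimal sketch of that idea with a hypothetical exception type; the real InvalidEncoderException above carries MediaCodecInfo[] instead:

// Sketch only: an exception that transports recovery data, so a top-level
// handler can turn the failure into a concrete suggestion.
final class UnknownEncoderException extends RuntimeException {
    private final String[] availableEncoders;

    UnknownEncoderException(String requested, String[] availableEncoders) {
        super("Unknown encoder: " + requested);
        this.availableEncoders = availableEncoders;
    }

    String[] getAvailableEncoders() {
        return availableEncoders;
    }
}

// In the handler:
//     if (e instanceof UnknownEncoderException) {
//         for (String name : ((UnknownEncoderException) e).getAvailableEncoders()) {
//             System.err.println("  scrcpy --encoder '" + name + "'");
//         }
//     }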
public static void main(String... args) throws Exception {
Thread.setDefaultUncaughtExceptionHandler((t, e) -> {
Ln.e("Exception on thread " + t, e);
suggestFix(e);
});
Options options = createOptions(args);
Ln.initLogLevel(options.getLogLevel());
if (options.getListEncoders() || options.getListDisplays()) {
if (options.getCleanup()) {
CleanUp.unlinkSelf();
}
if (options.getListEncoders()) {
Ln.i(LogUtils.buildVideoEncoderListMessage());
Ln.i(LogUtils.buildAudioEncoderListMessage());
}
if (options.getListDisplays()) {
Ln.i(LogUtils.buildDisplayListMessage());
}
// Just print the requested data, do not mirror
return;
}
try {
scrcpy(options);
} catch (ConfigurationException e) {
// Do not print stack trace, a user-friendly error message has already been logged
}
scrcpy(options);
}
}
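For reference, a minimal sketch (not the actual createOptions code) of the key=value argument convention consumed by main(): the first argument is the client version, and every following argument is split on its first '=' before being dispatched on the key:

import java.util.HashMap;
import java.util.Map;

// Sketch only: split "key=value" server arguments after the leading
// client-version argument, as the switch in createOptions() expects.
final class ServerArgsSketch {
    static Map<String, String> parse(String... args) {
        if (args.length < 1) {
            throw new IllegalArgumentException("Missing client version");
        }
        String clientVersion = args[0]; // the leading argument is the client version string
        Map<String, String> options = new HashMap<>();
        for (int i = 1; i < args.length; i++) {
            String arg = args[i];
            int eq = arg.indexOf('=');
            if (eq == -1) {
                throw new IllegalArgumentException("Invalid argument: " + arg);
            }
            options.put(arg.substring(0, eq), arg.substring(eq + 1));
        }
        return options;
    }

    private ServerArgsSketch() {
    }
}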

View File: Streamer.java

@@ -40,15 +40,10 @@ public final class Streamer {
}
}
public void writeDisableStream(boolean error) throws IOException {
// Writing a specific code as codec-id means that the device disables the stream
// code 0: it explicitly disables the stream (because it could not capture audio), scrcpy should continue mirroring video only
// code 1: a configuration error occurred, scrcpy must be stopped
byte[] code = new byte[4];
if (error) {
code[3] = 1;
}
IO.writeFully(fd, code, 0, code.length);
public void writeDisableStream() throws IOException {
// Writing 0 (32-bit) as codec-id means that the device disables the stream (because it could not capture)
byte[] zeros = new byte[4];
IO.writeFully(fd, zeros, 0, zeros.length);
}
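In both variants, the signal reuses the 4-byte codec-id slot written at the start of the stream. A client-side sketch of reading and interpreting that value; the constant names and the Java reader are illustrative only, the real scrcpy client is not shown here:

import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch only: client-side interpretation of the 32-bit codec id written at
// the start of the stream. 0 means "stream disabled"; in the variant with an
// error code, 1 means "configuration error".
final class StreamHeaderReader {
    static final int DISABLED = 0;
    static final int ERROR = 1;

    static int readCodecId(InputStream in) throws IOException {
        return new DataInputStream(in).readInt(); // big-endian 32-bit value
    }

    static boolean isDisabled(int codecId) {
        return codecId == DISABLED || codecId == ERROR;
    }

    private StreamHeaderReader() {
    }
}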
public void writePacket(ByteBuffer codecBuffer, MediaCodec.BufferInfo bufferInfo) throws IOException {
@@ -96,34 +91,31 @@ public final class Streamer {
// 00000050 00 00 00 |...|
//
// Each "section" is prefixed by a 64-bit ID and a 64-bit length.
//
// <https://developer.android.com/reference/android/media/MediaCodec#CSD>
boolean isConfig = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
if (!isConfig) {
return;
}
if (buffer.remaining() < 16) {
throw new IOException("Not enough data in OPUS config packet");
while (buffer.remaining() >= 16) {
long id = buffer.getLong();
long sizeLong = buffer.getLong();
if (sizeLong < 0 || sizeLong >= 0x7FFFFFFF) {
throw new IOException("Invalid block size in OPUS header: " + sizeLong);
}
int size = (int) sizeLong;
if (id == AOPUSHDR) {
if (buffer.remaining() < size) {
throw new IOException("Not enough data in OPUS header (invalid size: " + size + ")");
}
// Set the buffer to point to the OPUS header slice
buffer.limit(buffer.position() + size);
return;
}
buffer.position(buffer.position() + size);
}
long id = buffer.getLong();
if (id != AOPUSHDR) {
throw new IOException("OPUS header not found");
}
long sizeLong = buffer.getLong();
if (sizeLong < 0 || sizeLong >= 0x7FFFFFFF) {
throw new IOException("Invalid block size in OPUS header: " + sizeLong);
}
int size = (int) sizeLong;
if (buffer.remaining() < size) {
throw new IOException("Not enough data in OPUS header (invalid size: " + size + ")");
}
// Set the buffer to point to the OPUS header slice
buffer.limit(buffer.position() + size);
throw new IOException("OPUS header not found");
}
}
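The config buffer is a sequence of sections, each prefixed by a 64-bit ID and a 64-bit length, and only the AOPUSHDR payload is forwarded. A standalone sketch of walking those sections; the byte order and tag handling here are assumptions for the example, not necessarily identical to the AOPUSHDR constant used above:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

// Sketch only: walk the sections of an OPUS codec-specific-data buffer and
// return a slice of the payload whose 64-bit ID matches the "AOPUSHDR" tag.
final class OpusCsdSketch {
    static ByteBuffer findOpusHeader(ByteBuffer csd) throws IOException {
        ByteBuffer buffer = csd.duplicate().order(ByteOrder.LITTLE_ENDIAN);
        long headerId = tag("AOPUSHDR"); // 8 ASCII bytes read as a little-endian long
        while (buffer.remaining() >= 16) {
            long id = buffer.getLong();
            long size = buffer.getLong();
            if (size < 0 || size > Integer.MAX_VALUE || size > buffer.remaining()) {
                throw new IOException("Invalid section size: " + size);
            }
            if (id == headerId) {
                ByteBuffer slice = buffer.slice();
                slice.limit((int) size);
                return slice;
            }
            buffer.position(buffer.position() + (int) size); // skip this section
        }
        throw new IOException("OPUS header not found");
    }

    private static long tag(String ascii) {
        byte[] bytes = ascii.getBytes(StandardCharsets.US_ASCII);
        return ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getLong();
    }

    private OpusCsdSketch() {
    }
}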

View File: VideoCodec.java

@@ -1,12 +1,10 @@
package com.genymobile.scrcpy;
import android.annotation.SuppressLint;
import android.media.MediaFormat;
public enum VideoCodec implements Codec {
H264(0x68_32_36_34, "h264", MediaFormat.MIMETYPE_VIDEO_AVC),
H265(0x68_32_36_35, "h265", MediaFormat.MIMETYPE_VIDEO_HEVC),
@SuppressLint("InlinedApi") // introduced in API 21
AV1(0x00_61_76_31, "av1", MediaFormat.MIMETYPE_VIDEO_AV1);
private final int id; // 4-byte ASCII representation of the name
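The id field is the codec name packed into four ASCII bytes; a small sketch showing the correspondence (illustration only):

import java.nio.charset.StandardCharsets;

// Sketch only: pack a codec name of up to 4 ASCII characters into the 32-bit
// id, so that 0x68_32_36_34 corresponds to "h264" and 0x00_61_76_31 to "av1".
final class CodecIdSketch {
    static int toCodecId(String name) {
        byte[] bytes = name.getBytes(StandardCharsets.US_ASCII);
        if (bytes.length > 4) {
            throw new IllegalArgumentException("Codec name too long: " + name);
        }
        int id = 0;
        for (byte b : bytes) {
            id = (id << 8) | (b & 0xFF);
        }
        return id;
    }

    public static void main(String[] args) {
        System.out.printf("0x%08x%n", toCodecId("h264")); // prints 0x68323634
        System.out.printf("0x%08x%n", toCodecId("av1"));  // prints 0x00617631
    }
}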

View File: Workarounds.java

@@ -2,6 +2,7 @@ package com.genymobile.scrcpy;
import android.annotation.SuppressLint;
import android.app.Application;
import android.app.Instrumentation;
import android.content.ContextWrapper;
import android.content.pm.ApplicationInfo;
import android.os.Looper;

View File: wrappers/ActivityManager.java

@@ -1,30 +1,22 @@
package com.genymobile.scrcpy.wrappers;
import com.genymobile.scrcpy.FakeContext;
import com.genymobile.scrcpy.Ln;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.content.Intent;
import android.os.Binder;
import android.os.Build;
import android.os.Bundle;
import android.os.IBinder;
import android.os.IInterface;
import android.os.Process;
import java.lang.reflect.Field;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
@SuppressLint("PrivateApi,DiscouragedPrivateApi")
public class ActivityManager {
private final IInterface manager;
private Method getContentProviderExternalMethod;
private boolean getContentProviderExternalMethodNewVersion = true;
private Method removeContentProviderExternalMethod;
private Method startActivityAsUserWithFeatureMethod;
private Method forceStopPackageMethod;
public ActivityManager(IInterface manager) {
this.manager = manager;
@@ -51,17 +43,16 @@ public class ActivityManager {
return removeContentProviderExternalMethod;
}
@TargetApi(Build.VERSION_CODES.Q)
private ContentProvider getContentProviderExternal(String name, IBinder token) {
try {
Method method = getGetContentProviderExternalMethod();
Object[] args;
if (getContentProviderExternalMethodNewVersion) {
// new version
args = new Object[]{name, FakeContext.ROOT_UID, token, null};
args = new Object[]{name, Process.ROOT_UID, token, null};
} else {
// old version
args = new Object[]{name, FakeContext.ROOT_UID, token};
args = new Object[]{name, Process.ROOT_UID, token};
}
// ContentProviderHolder providerHolder = getContentProviderExternal(...);
Object providerHolder = method.invoke(manager, args);
@@ -94,55 +85,4 @@ public class ActivityManager {
public ContentProvider createSettingsProvider() {
return getContentProviderExternal("settings", new Binder());
}
private Method getStartActivityAsUserWithFeatureMethod() throws NoSuchMethodException, ClassNotFoundException {
if (startActivityAsUserWithFeatureMethod == null) {
Class<?> iApplicationThreadClass = Class.forName("android.app.IApplicationThread");
Class<?> profilerInfo = Class.forName("android.app.ProfilerInfo");
startActivityAsUserWithFeatureMethod = manager.getClass()
.getMethod("startActivityAsUserWithFeature", iApplicationThreadClass, String.class, String.class, Intent.class, String.class,
IBinder.class, String.class, int.class, int.class, profilerInfo, Bundle.class, int.class);
}
return startActivityAsUserWithFeatureMethod;
}
@SuppressWarnings("ConstantConditions")
public int startActivityAsUserWithFeature(Intent intent) {
try {
Method method = getStartActivityAsUserWithFeatureMethod();
return (int) method.invoke(
/* this */ manager,
/* caller */ null,
/* callingPackage */ FakeContext.PACKAGE_NAME,
/* callingFeatureId */ null,
/* intent */ intent,
/* resolvedType */ null,
/* resultTo */ null,
/* resultWho */ null,
/* requestCode */ 0,
/* startFlags */ 0,
/* profilerInfo */ null,
/* bOptions */ null,
/* userId */ /* UserHandle.USER_CURRENT */ -2);
} catch (Throwable e) {
Ln.e("Could not invoke method", e);
return 0;
}
}
private Method getForceStopPackageMethod() throws NoSuchMethodException {
if (forceStopPackageMethod == null) {
forceStopPackageMethod = manager.getClass().getMethod("forceStopPackage", String.class, int.class);
}
return forceStopPackageMethod;
}
public void forceStopPackage(String packageName) {
try {
Method method = getForceStopPackageMethod();
method.invoke(manager, packageName, /* userId */ /* UserHandle.USER_CURRENT */ -2);
} catch (Throwable e) {
Ln.e("Could not invoke method", e);
}
}
}
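These wrappers resolve hidden IActivityManager methods by name and cache the resulting Method objects, since the signatures are not part of the public SDK and can differ between Android versions. A generic sketch of that pattern, with a plain Object standing in for the binder proxy:

import java.lang.reflect.Method;

// Sketch only: lazily resolve a hidden method by reflection, cache it, and
// tolerate failures because hidden signatures vary across Android versions.
final class ReflectionCallSketch {
    private final Object manager;
    private Method forceStopPackageMethod;

    ReflectionCallSketch(Object manager) {
        this.manager = manager;
    }

    private Method getForceStopPackageMethod() throws NoSuchMethodException {
        if (forceStopPackageMethod == null) {
            forceStopPackageMethod = manager.getClass().getMethod("forceStopPackage", String.class, int.class);
        }
        return forceStopPackageMethod;
    }

    void forceStopPackage(String packageName, int userId) {
        try {
            getForceStopPackageMethod().invoke(manager, packageName, userId);
        } catch (ReflectiveOperationException e) {
            // The method may not exist or may not be invocable on this device.
            e.printStackTrace();
        }
    }
}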

View File: wrappers/ClipboardManager.java

@@ -7,6 +7,7 @@ import android.content.ClipData;
import android.content.IOnPrimaryClipChangedListener;
import android.os.Build;
import android.os.IInterface;
import android.os.Process;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
@@ -62,9 +63,9 @@ public class ClipboardManager {
return (ClipData) method.invoke(manager, FakeContext.PACKAGE_NAME);
}
if (alternativeMethod) {
return (ClipData) method.invoke(manager, FakeContext.PACKAGE_NAME, null, FakeContext.ROOT_UID);
return (ClipData) method.invoke(manager, FakeContext.PACKAGE_NAME, null, Process.ROOT_UID);
}
return (ClipData) method.invoke(manager, FakeContext.PACKAGE_NAME, FakeContext.ROOT_UID);
return (ClipData) method.invoke(manager, FakeContext.PACKAGE_NAME, Process.ROOT_UID);
}
private static void setPrimaryClip(Method method, boolean alternativeMethod, IInterface manager, ClipData clipData)
@@ -72,9 +73,9 @@ public class ClipboardManager {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) {
method.invoke(manager, clipData, FakeContext.PACKAGE_NAME);
} else if (alternativeMethod) {
method.invoke(manager, clipData, FakeContext.PACKAGE_NAME, null, FakeContext.ROOT_UID);
method.invoke(manager, clipData, FakeContext.PACKAGE_NAME, null, Process.ROOT_UID);
} else {
method.invoke(manager, clipData, FakeContext.PACKAGE_NAME, FakeContext.ROOT_UID);
method.invoke(manager, clipData, FakeContext.PACKAGE_NAME, Process.ROOT_UID);
}
}
@@ -109,9 +110,9 @@ public class ClipboardManager {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) {
method.invoke(manager, listener, FakeContext.PACKAGE_NAME);
} else if (alternativeMethod) {
method.invoke(manager, listener, FakeContext.PACKAGE_NAME, null, FakeContext.ROOT_UID);
method.invoke(manager, listener, FakeContext.PACKAGE_NAME, null, Process.ROOT_UID);
} else {
method.invoke(manager, listener, FakeContext.PACKAGE_NAME, FakeContext.ROOT_UID);
method.invoke(manager, listener, FakeContext.PACKAGE_NAME, Process.ROOT_UID);
}
}
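The clipboard calls above pick one of three argument shapes depending on the Android version and on which hidden signature the device exposes. A condensed sketch of that selection; describing the extra null as an attribution tag is an assumption:

import android.os.Build;

// Sketch only: how the argument list for the reflective clipboard calls is
// chosen. "listener", "packageName" and "uid" are placeholders for the values
// supplied above.
final class ClipboardCallShape {
    static Object[] listenerArgs(boolean alternativeMethod, Object listener, String packageName, int uid) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) {
            return new Object[]{listener, packageName};
        }
        if (alternativeMethod) {
            return new Object[]{listener, packageName, null, uid}; // the extra null is presumably the attribution tag
        }
        return new Object[]{listener, packageName, uid};
    }

    private ClipboardCallShape() {
    }
}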

View File: wrappers/ContentProvider.java

@@ -9,6 +9,7 @@ import android.content.AttributionSource;
import android.os.Build;
import android.os.Bundle;
import android.os.IBinder;
import android.os.Process;
import java.io.Closeable;
import java.lang.reflect.InvocationTargetException;
@@ -138,7 +139,7 @@ public class ContentProvider implements Closeable {
public String getValue(String table, String key) throws SettingsException {
String method = getGetMethod(table);
Bundle arg = new Bundle();
arg.putInt(CALL_METHOD_USER_KEY, FakeContext.ROOT_UID);
arg.putInt(CALL_METHOD_USER_KEY, Process.ROOT_UID);
try {
Bundle bundle = call(method, key, arg);
if (bundle == null) {
@@ -154,7 +155,7 @@ public class ContentProvider implements Closeable {
public void putValue(String table, String key, String value) throws SettingsException {
String method = getPutMethod(table);
Bundle arg = new Bundle();
arg.putInt(CALL_METHOD_USER_KEY, FakeContext.ROOT_UID);
arg.putInt(CALL_METHOD_USER_KEY, Process.ROOT_UID);
arg.putString(NAME_VALUE_TABLE_VALUE, value);
try {
call(method, key, arg);