I don’t know what’s going on, but we can use the “pkill” command to kill those abnormal processes.
sudo pkill -f pgrep
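Since -f matches against the full command line, it is worth previewing what will be killed before running pkill. A minimal sketch, using a throwaway "sleep 300" process as a stand-in for the abnormal process:

```shell
# Start a stand-in process to act as the "abnormal" one.
sleep 300 &
# Preview which processes a full-command-line match (-f) would hit.
pgrep -f "sleep 300"
# Once the list looks right, kill everything whose command line matches.
pkill -f "sleep 300"
```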
Reference: stackoverflow: How to kill all processes with a given partial name?
Source:
CSDN: The available and read methods of the InputStream class may fail to read the complete stream data
stackoverflow: How to read a http file inmemory without saving on local drive?
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;

// available() may under-report and a single read() may return fewer bytes than
// requested, so loop until read() returns -1 (end of stream).
private static InputStream download(String sourceURL) throws IOException {
    try (InputStream inputStream = new URL(sourceURL).openStream()) {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int n;
        while ((n = inputStream.read(buffer)) != -1) {
            output.write(buffer, 0, n);
        }
        return new ByteArrayInputStream(output.toByteArray());
    }
}
When we use the “ssh” command with auto-completion, we get this error result.
This is because we previously edited the config with the “Sublime Text” editor, which saved it with Windows (CRLF) line endings. We can change the line-ending setting in Sublime Text to Unix (LF) and save the file again.
Now we get the desired result.
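Alternatively, the file can be converted on the command line. A sketch assuming the offending file is ~/.ssh/config:

```shell
# Strip carriage returns (CRLF -> LF) from the ssh config.
tr -d '\r' < ~/.ssh/config > ~/.ssh/config.unix
mv ~/.ssh/config.unix ~/.ssh/config
```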
We can usually find information about connecting to the Guest machine in VirtualBox, and most guides use the Host-only network setting.
But in VirtualBox 7.0.4 the same setting does not work out of the box, because the new network interface in the Guest is not enabled correctly. This document describes the steps to enable the network interface.
First, check the name of the network interface.
ifconfig -a | more
From the output we can see that the new network interface is named “enp0s8”.
Edit the network configuration. (Here we use Ubuntu 22.04)
vim /etc/netplan/00-installer-config.yaml
Add enp0s8 settings to this file.
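A minimal example of what the file might look like afterwards (the enp0s3 entry and the use of DHCP are assumptions — the host-only adapter usually gets an address from VirtualBox’s DHCP server; use a static address in the host-only subnet if you prefer):

```yaml
network:
  version: 2
  ethernets:
    enp0s3:        # existing NAT interface (name is an assumption)
      dhcp4: true
    enp0s8:        # the host-only interface we are enabling
      dhcp4: true
```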
Execute the following command to apply the above settings.
netplan generate
netplan apply
public static void main(String[] args) throws Exception {
String number = "1000500000";
double amount = Double.parseDouble(number);
DecimalFormat intFormatter = new DecimalFormat("#,###");
DecimalFormat floatFormatter = new DecimalFormat("#,###.00");
System.out.println("==int==");
System.out.println(intFormatter.format(amount));
System.out.println(String.format("%,d", Integer.parseInt(number)));
System.out.println(number.replaceAll("(\\d)(?=(\\d{3})+$)", "$1,"));
System.out.println("==float==");
System.out.println(floatFormatter.format(amount));
System.out.println(String.format("%,.2f", Float.parseFloat(number)));
}
==int==
1,000,500,000
1,000,500,000
1,000,500,000
==float==
1,000,500,000.00
1,000,499,968.00
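The last line differs because float has only a 24-bit mantissa (about 7 significant decimal digits), so 1000500000 is rounded to the nearest representable value, 1000499968; double has a 53-bit mantissa and holds the value exactly. A small check:

```java
public class FloatPrecision {
    public static void main(String[] args) {
        // float rounds 1000500000 to the nearest representable value.
        System.out.println((long) Float.parseFloat("1000500000"));   // 1000499968
        // double represents this integer exactly.
        System.out.println((long) Double.parseDouble("1000500000")); // 1000500000
    }
}
```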
Reference: [stackoverflow] Mask some part of String
return input.replaceAll("\\b(\\d{3})\\d+(\\d{3})", "$1****$2"); // "input" is the String to mask
# 1234567890 => 123****890
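Wrapped in a runnable method (the class and method names are just illustrative):

```java
public class MaskDemo {
    // Keep the first three and last three digits; mask everything in between.
    static String mask(String input) {
        return input.replaceAll("\\b(\\d{3})\\d+(\\d{3})", "$1****$2");
    }

    public static void main(String[] args) {
        System.out.println(mask("1234567890")); // 123****890
    }
}
```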
curl --trace-ascii - https://cowmanchiang.me/wp
0000: GET /wp HTTP/2
0010: Host: cowmanchiang.me
0027: user-agent: curl/7.79.1
0040: accept: */*
004d:
== Info: Connection state changed (MAX_CONCURRENT_STREAMS == 128)!
<= Recv header, 13 bytes (0xd)
0000: HTTP/2 301
<= Recv header, 15 bytes (0xf)
0000: server: nginx
<= Recv header, 37 bytes (0x25)
0000: date: Wed, 23 Nov 2022 09:14:23 GMT
<= Recv header, 45 bytes (0x2d)
0000: content-type: text/html; charset=iso-8859-1
<= Recv header, 21 bytes (0x15)
0000: content-length: 235
<= Recv header, 39 bytes (0x27)
0000: location: https://cowmanchiang.me/wp/
<= Recv header, 23 bytes (0x17)
0000: vary: Accept-Encoding
<= Recv header, 2 bytes (0x2)
0000:
<= Recv data, 235 bytes (0xeb)
0000: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">.<html><head>.
0040: <title>301 Moved Permanently</title>.</head><body>.<h1>Moved Per
0080: manently</h1>.<p>The document has moved <a href="https://cowmanc
00c0: hiang.me/wp/">here</a>.</p>.</body></html>.
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="https://cowmanchiang.me/wp/">here</a>.</p>
</body></html>
== Info: Connection #0 to host cowmanchiang.me left intact
Install sshfs
Installing with brew (brew install sshfs) fails with this error message: “Error: sshfs has been disabled because it requires FUSE!”
Ref. https://github.com/telepresenceio/telepresence/issues/1654#issuecomment-1204676705
So download and install https://github.com/osxfuse/sshfs/releases/download/osxfuse-sshfs-2.5.0/sshfs-2.5.0.pkg
Create local folder
# mkdir mountFolder
Mount remote folder
# sshfs name@ssh_host:/path/to/folder mountFolder
Unmount folder
# umount mountFolder
Install rocksdb first.
brew install rocksdb
Add an alias in .zshrc (note: no spaces around “=” in a shell alias)
alias ldb='rocksdb_ldb --db=. '
List all column families
# ldb list_column_families
Column families in .:
{default, S1, User, C1, C2, U1, C3}
Scan command:
# ldb --column_family=User scan
User_U_name : cowman
User_U_status : 0
User_U_type : 0
User_U_updatedTimestamp :
User_U_userId : U
Show result with hex value
ldb --column_family=User scan --value_hex
User_U_name : 0x636F776D616E
User_U_status : 0x30
User_U_type : 0x30
User_U_updatedTimestamp : 0x0000000000000000
User_U_userId : 0x55
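As a quick sanity check, the same tool can round-trip a key on a scratch database (the /tmp path is just an example; put/get/scan and --create_if_missing are documented in the help text below):

```shell
# Create a scratch DB, write one key, and read it back.
rocksdb_ldb --db=/tmp/scratchdb put mykey myvalue --create_if_missing
rocksdb_ldb --db=/tmp/scratchdb get mykey     # prints the stored value
rocksdb_ldb --db=/tmp/scratchdb scan          # mykey : myvalue
```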
Command description (some commands behave differently from how they are used on Linux servers)
ldb - RocksDB Tool
commands MUST specify --db=<full_path_to_db_directory> when necessary
commands can optionally specify
--env_uri=<uri_of_environment> or --fs_uri=<uri_of_filesystem> if necessary
--secondary_path=<secondary_path> to open DB as secondary instance. Operations not supported in secondary instance will fail.
The following optional parameters control if keys/values are input/output as hex or as plain strings:
--key_hex : Keys are input/output as hex
--value_hex : Values are input/output as hex
--hex : Both keys and values are input/output as hex
The following optional parameters control the database internals:
--column_family=<string> : name of the column family to operate on. default: default column family
--ttl with 'put','get','scan','dump','query','batchput' : DB supports ttl and value is internally timestamp-suffixed
--try_load_options : Try to load option file from DB. Default to true if db is specified and not creating a new DB and not open as TTL DB. Can be set to false explicitly.
--disable_consistency_checks : Set options.force_consistency_checks = false.
--ignore_unknown_options : Ignore unknown options when loading option file.
--bloom_bits=<int,e.g.:14>
--fix_prefix_len=<int,e.g.:14>
--compression_type=<no|snappy|zlib|bzip2|lz4|lz4hc|xpress|zstd>
--compression_max_dict_bytes=<int,e.g.:16384>
--block_size=<block_size_in_bytes>
--auto_compaction=<true|false>
--db_write_buffer_size=<int,e.g.:16777216>
--write_buffer_size=<int,e.g.:4194304>
--file_size=<int,e.g.:2097152>
--enable_blob_files : Enable key-value separation using BlobDB
--min_blob_size=<int,e.g.:2097152>
--blob_file_size=<int,e.g.:2097152>
--blob_compression_type=<no|snappy|zlib|bzip2|lz4|lz4hc|xpress|zstd>
--enable_blob_garbage_collection : Enable blob garbage collection
--blob_garbage_collection_age_cutoff=<double,e.g.:0.25>
--blob_garbage_collection_force_threshold=<double,e.g.:0.25>
--blob_compaction_readahead_size=<int,e.g.:2097152>
Data Access Commands:
put <key> <value> [--create_if_missing] [--ttl]
get <key> [--ttl]
batchput <key> <value> [<key> <value>] [..] [--create_if_missing] [--ttl]
scan [--from] [--to] [--ttl] [--timestamp] [--max_keys=<N>q] [--start_time=<N>:- is inclusive] [--end_time=<N>:- is exclusive] [--no_value]
delete <key>
deleterange <begin key> <end key>
query [--ttl]
Starts a REPL shell. Type help for list of available commands.
approxsize [--from] [--to]
checkconsistency
list_file_range_deletes [--max_keys=<N>] : print tombstones in SST files.
Admin Commands:
dump_wal --walfile=<write_ahead_log_file_path> [--header] [--print_value] [--write_committed=true|false]
compact [--from] [--to]
reduce_levels --new_levels=<New number of levels> [--print_old_levels]
change_compaction_style --old_compaction_style=<Old compaction style: 0 for level compaction, 1 for universal compaction> --new_compaction_style=<New compaction style: 0 for level compaction, 1 for universal compaction>
dump [--from] [--to] [--ttl] [--max_keys=<N>] [--timestamp] [--count_only] [--count_delim=<char>] [--stats] [--bucket=<N>] [--start_time=<N>:- is inclusive] [--end_time=<N>:- is exclusive] [--path=<path_to_a_file>] [--decode_blob_index] [--dump_uncompressed_blobs]
load [--create_if_missing] [--disable_wal] [--bulk_load] [--compact]
manifest_dump [--verbose] [--json] [--path=<path_to_manifest_file>]
update_manifest [--update_temperatures] MUST NOT be used on a live DB.
file_checksum_dump [--path=<path_to_manifest_file>]
get_property <property_name>
list_column_families
create_column_family --db=<db_path> <new_column_family_name>
drop_column_family --db=<db_path> <column_family_name_to_drop>
dump_live_files [--decode_blob_index] [--dump_uncompressed_blobs]
idump [--from] [--to] [--input_key_hex] [--max_keys=<N>] [--count_only] [--count_delim=<char>] [--stats] [--decode_blob_index]
list_live_files_metadata [--sort_by_filename]
repair [--verbose]
backup [--backup_env_uri | --backup_fs_uri] [--backup_dir] [--num_threads] [--stderr_log_level=<int (InfoLogLevel)>]
restore [--backup_env_uri | --backup_fs_uri] [--backup_dir] [--num_threads] [--stderr_log_level=<int (InfoLogLevel)>]
checkpoint [--checkpoint_dir]
write_extern_sst <output_sst_path>
ingest_extern_sst <input_sst_path> [--move_files] [--snapshot_consistency] [--allow_global_seqno] [--allow_blocking_flush] [--ingest_behind] [--write_global_seqno]
unsafe_remove_sst_file <SST file number> MUST NOT be used on a live DB.