Use Redis-shake To Migrate Self-built Redis To Alibaba Cloud
1. Experiment
1.1 Knowledge points
This experiment uses ApsaraDB for Redis. It introduces the process of migrating the existing and incremental data of a self-built Redis instance to Alibaba Cloud Redis with the Redis-shake migration tool. ApsaraDB for Redis is a database service that is compatible with the open-source Redis protocol standard and provides hybrid storage. It is designed on a dual-system hot-standby architecture and a cluster architecture, which enables it to meet requirements for high throughput, low latency, and flexible configuration.
1.2 Experiment process
- Self-built Redis environment preparation
- Alibaba Cloud Redis environment preparation
- Use Redis-shake to migrate data
1.3 Scene architecture diagram
![image desc](https://labex.io/upload/N/O/M/xAIE6OsvdmUc.jpg)
1.4 Cloud resources required
1.5 Prerequisites
- During the experiment, you may use your own Alibaba Cloud account instead of the account provided by this lab. However, to make sure the experiment goes smoothly, you need to choose the same Ubuntu 16.04 operating system for your ECS.
- Before starting this experiment, please confirm that the previous experiment has been closed and exited normally.
2. Start the experiment environment
Click Start Lab in the upper right corner of the page to start the experiment.
After the experiment environment is started, the system deploys the resources needed for the experiment in the background, for example the ECS instance, RDS instance, Server Load Balancer instance, and OSS bucket. An account, consisting of a username and password for logging on to the Web console of Alibaba Cloud, is also provided.
![image desc](https://labex.io/upload/V/Y/J/K6ZmihEjVmNx.jpg)
After the experiment environment is started and related resources are properly deployed, the experiment starts a countdown. You have an hour to perform experimental operations. After the countdown ends, the experiment stops, and related resources are released. During the experiment, pay attention to the remaining time and arrange your time wisely. Next, use the username and password provided by the system to log on to the Web console of Alibaba Cloud and view related resources:
![image desc](https://labex.io/upload/Y/X/J/9xgwPbvjt1dl.jpg)
Go to the logon page of Alibaba Cloud console.
![image desc](https://labex.io/upload/W/A/T/7cUrpJ11BZxO.jpg)
Fill in the sub-user account and click on Next.
![image desc](https://labex.io/upload/X/I/C/KPJoubT6IWIC.jpg)
Fill in the sub-user password and click on Login.
![image desc](https://labex.io/upload/I/K/E/R6OEW68ywNjG.jpg)
After successfully logging on to the console, you will see the page shown in the following figure.
![image desc](https://labex.io/upload/Y/K/F/e4vf4XHIHn8n.jpg)
3. Prepare self-built Redis environment
3.1 Log on to ECS
Click on Elastic Compute Service, as shown in the following figure.
![image desc](https://labex.io/upload/Q/C/E/vWl7sSgr7lBx.jpg)
We can see one running ECS instance in the Silicon Valley region.
![image desc](https://labex.io/upload/S/E/D/vDZbYfsSiRRV.jpg)
Copy this ECS instance's Internet IP address and remotely log on to this ECS instance (Ubuntu system). For details of remote logon, refer to login.
![image desc](https://labex.io/upload/Q/H/O/mbkcyjC1JCxF.jpg)
The default account name and password of the ECS instance:
Account name: root
Password: nkYHG890..
3.2 Install Redis
Enter the following command to download the Redis installation package.
wget https://labex-ali-data.oss-us-west-1.aliyuncs.com/redis/redis-5.0.12.tar.gz
![image desc](https://labex.io/upload/C/C/U/v31bqXJLFSYD.jpg)
Enter the following command to decompress the installation package.
tar -xzf redis-5.0.12.tar.gz
![image desc](https://labex.io/upload/A/M/B/4zBX5ihBBm0j.jpg)
Enter the following command to compile Redis.
cd redis-5.0.12 && make
![image desc](https://labex.io/upload/W/O/B/bvT8Tnrsvh1s.jpg)
Enter the command vim /etc/profile, copy the following content to the file, then save and exit.
export PATH=$PATH:/root/redis-5.0.12/src
![image desc](https://labex.io/upload/R/M/D/sLoOzXJdTMiB.jpg)
Enter the following command for the modifications to take effect.
source /etc/profile
![image desc](https://labex.io/upload/O/J/U/P6n0UgrkHrRD.jpg)
Enter the following command to start the Redis server.
nohup redis-server &
![image desc](https://labex.io/upload/M/U/D/if3CiN5mBFBQ.jpg)
Enter the following commands to start the Redis client and view the number of keys.
redis-cli
dbsize
![image desc](https://labex.io/upload/T/P/K/j8zRbH1TsDPv.jpg)
Enter the following commands to set and then read a key in Redis.
set name labex
get name
![image desc](https://labex.io/upload/B/J/L/RjayPes4Ix91.jpg)
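The SET and GET commands above form the basic Redis string round-trip. As a minimal illustration of the semantics, here is a toy in-memory stand-in (not the real client, so it runs without a server):

```python
class MiniStore:
    """Toy in-memory model of Redis string SET/GET semantics."""
    def __init__(self):
        self._data = {}

    def set(self, key, value):
        self._data[key] = value
        return True  # Redis replies OK on a successful SET

    def get(self, key):
        return self._data.get(key)  # Redis replies nil for a missing key

r = MiniStore()
r.set("name", "labex")
print(r.get("name"))     # labex
print(r.get("missing"))  # None
```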
3.3 Prepare the data
Enter the command exit to exit the Redis client.
![image desc](https://labex.io/upload/P/C/O/f39faZ9oUY9n.jpg)
Enter the following command to install the pip3 tool.
cd && apt update && apt -y install python3-pip
![image desc](https://labex.io/upload/R/R/V/KNbxLYs5jYL1.jpg)
Enter the following command to set the locale environment variable.
export LC_ALL=C
![image desc](https://labex.io/upload/V/C/S/WSAQkQPujlHD.jpg)
Enter the following command to install the Redis dependency package of python.
pip3 install redis
![image desc](https://labex.io/upload/R/T/D/X4JHM3N6fQlT.jpg)
Enter the command vim data.py, copy the following content to the file, then save and exit.
```python
#!/usr/bin/python3
import redis
import random, string, os, time
from multiprocessing import Process

def getRedis():
    r = redis.Redis(host = "127.0.0.1", port = 6379, db = 0)
    return r

def createKey():
    print("Process :" + str(os.getpid()) + " is running")
    r = getRedis()
    for i in range(10000):
        r.set('labex-' + str(os.getpid()) + '-' + str(i), 'data' + str(i))

if __name__ == "__main__":
    list_process = []
    for i in range(10):
        p = Process(target = createKey)
        list_process.append(p)
    for p in list_process:
        p.daemon = True
        p.start()
    for p in list_process:
        p.join()
    print("success")
```
![image desc](https://labex.io/upload/B/K/P/gLrBZ7wgDFfu.jpg)
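The script spawns 10 worker processes, each writing 10,000 keys named `labex-<pid>-<i>`, so one full run adds 100,000 keys. A quick sketch of that arithmetic and key format (the PIDs below are made up for illustration):

```python
PROCESSES = 10
KEYS_PER_PROCESS = 10000

def keys_for_process(pid):
    """Key names written by one worker, matching the script's pattern."""
    return ['labex-' + str(pid) + '-' + str(i) for i in range(KEYS_PER_PROCESS)]

# Simulate a full run with made-up PIDs; each process writes a distinct set.
all_keys = set()
for fake_pid in range(1000, 1000 + PROCESSES):
    all_keys.update(keys_for_process(fake_pid))

print(len(all_keys))  # 100000
```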
Enter the following command to execute the script and insert 100,000 pieces of data into Redis. The process takes about 1~2 minutes.
python3 data.py
![image desc](https://labex.io/upload/D/A/I/J24XIwRNmx56.jpg)
The data insertion is complete.
Enter the following commands to see the number of keys currently in Redis.
redis-cli
dbsize
![image desc](https://labex.io/upload/V/K/B/DVCcwrcWQO1q.jpg)
<font color='red'>When doing the experiment, you can take a screenshot of the result above and send it to the teacher to indicate that this chapter has been completed.</font>
For now, enter the command exit to exit the client.
![image desc](https://labex.io/upload/E/X/N/LjOpBH2fZV8i.jpg)
4. Prepare Alibaba Cloud Redis environment
4.1 Set the whitelist
Go to the Alibaba Cloud Redis console as shown in the figure below.
![image desc](https://labex.io/upload/B/D/C/TThBILxBz0xY.jpg)
Select the US (Silicon Valley) region; you can see an existing Redis instance.
![image desc](https://labex.io/upload/C/J/D/maECR2rkHQJe.jpg)
Click on the instance ID.
![image desc](https://labex.io/upload/O/E/K/8peo8ZYqmivq.jpg)
To set the whitelist, click on Modify.
![image desc](https://labex.io/upload/M/P/A/M2YyBvs7MgXr.jpg)
Add the internal network address of ECS.
![image desc](https://labex.io/upload/U/J/A/fNQiBVrwmHLa.jpg)
![image desc](https://labex.io/upload/U/V/T/uDFaKHyS25Na.jpg)
Now you can see the address is added.
![image desc](https://labex.io/upload/N/C/G/ihkHLzlQuipv.jpg)
4.2 Create Account
Click on Create as shown in the figure below.
![image desc](https://labex.io/upload/Y/Y/W/535TBHIz1NWC.jpg)
Set the account name to labex and the password to Aliyun-test as shown in the figure below, and click on OK.
![image desc](https://labex.io/upload/Q/T/D/RbRSIquhrKid.jpg)
The account is being activated.
![image desc](https://labex.io/upload/C/C/K/aInh4MHg1N7d.jpg)
Now the account activation is complete.
![image desc](https://labex.io/upload/E/D/I/0VZEVLdqGWn9.jpg)
The other account that appears here is the default account, whose password was set to "Aliyun-test" when the instance was created.
Now you can check the intranet connection address of Redis.
![image desc](https://labex.io/upload/B/M/H/yFkWX6z71yMN.jpg)
Now go back to the ECS command line.
Enter the following commands to connect to Alibaba Cloud Redis. Replace YOUR-ALI-REDIS-ADDR with your own Alibaba Cloud Redis intranet connection address.
redis-cli -h YOUR-ALI-REDIS-ADDR -p 6379 -a Aliyun-test
dbsize
![image desc](https://labex.io/upload/O/X/W/4oBFH4w5cw6S.jpg)
Now you are successfully connected to Alibaba Cloud Redis.
Enter the command exit to exit the client.
![image desc](https://labex.io/upload/H/D/I/hEfQDPBsChmD.jpg)
5. Use Redis-shake to migrate data
Enter the following command to download the Redis-shake installation package.
wget http://labex-ali-data.oss-us-west-1.aliyuncs.com/redis/redis-shake-v2.0.3.tar.gz
![image desc](https://labex.io/upload/M/M/O/Y1VFWK8jEoJe.jpg)
Enter the following command to decompress the installation package.
tar -zxvf redis-shake-v2.0.3.tar.gz
![image desc](https://labex.io/upload/R/B/D/UbGeQrLjFcRn.jpg)
Enter the following commands to enter the decompressed directory and list its contents.
cd redis-shake-v2.0.3/
ls
![image desc](https://labex.io/upload/H/N/O/iFx7XG4PYA66.jpg)
Enter the command vim redis-shake.conf to open the configuration file of Redis-shake.
Refer to the source parameter settings below:
source.type = standalone
source.address = 127.0.0.1:6379
source.password_raw =
source.auth_type = auth
![image desc](https://labex.io/upload/U/D/T/TBGzoc4IfK6J.jpg)
Refer to the target parameter settings below. Replace YOUR-ALI-REDIS-ADDR with your own Alibaba Cloud Redis intranet connection address.
target.type = standalone
target.address = YOUR-ALI-REDIS-ADDR:6379
target.password_raw = labex:Aliyun-test
target.auth_type = auth
![image desc](https://labex.io/upload/A/X/A/3OAsCEJGmJ3C.jpg)
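Before starting the sync, it helps to double-check that every required key = value pair above is present. The sketch below parses such key = value lines and validates the fields shown in this section (the parser and the sample target address are illustrative, not part of Redis-shake itself):

```python
def parse_shake_conf(text):
    """Parse key = value lines from a redis-shake style config, skipping blanks and comments."""
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        conf[key.strip()] = value.strip()
    return conf

# Sample config mirroring the settings in this section; the target address is a placeholder.
sample = """
source.type = standalone
source.address = 127.0.0.1:6379
source.password_raw =
source.auth_type = auth
target.type = standalone
target.address = r-example.redis.rds.aliyuncs.com:6379
target.password_raw = labex:Aliyun-test
target.auth_type = auth
"""

conf = parse_shake_conf(sample)
# Both endpoints must be standalone and must have a non-empty address.
for side in ('source', 'target'):
    assert conf[side + '.type'] == 'standalone'
    assert conf[side + '.address']
print(conf['target.address'])  # r-example.redis.rds.aliyuncs.com:6379
```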
Enter the following command to start the migration task.
./redis-shake.linux -type=sync -conf=redis-shake.conf
![image desc](https://labex.io/upload/L/V/Q/Q1vgpgrO02s0.jpg)
As shown in the figure above, Redis-shake migrates the stock data first, and then migrates the incremental data.
Now you need to open a second command line and connect to the ECS instance.
Enter the following commands to connect to Alibaba Cloud Redis. Replace YOUR-ALI-REDIS-ADDR with your own Alibaba Cloud Redis intranet connection address.
redis-cli -h YOUR-ALI-REDIS-ADDR -p 6379 -a Aliyun-test
dbsize
![image desc](https://labex.io/upload/L/V/K/47nEIqUSAskI.jpg)
You can see that the existing data has been migrated.
Now you need to open a third ECS command line.
Enter the following command to insert another 100,000 pieces of data into the local Redis instance.
python3 data.py
![image desc](https://labex.io/upload/D/A/I/J24XIwRNmx56.jpg)
Go back to the first command line; you can see the incremental data being migrated.
![image desc](https://labex.io/upload/W/A/W/tA1MHd9mP4os.jpg)
<font color='red'>When doing the experiment, you can take a screenshot of the result above and send it to the teacher to indicate that the experiment has been completed.</font>
After the data insertion completes, enter the dbsize command in the second command line; you can see that the newly added data has been migrated, and there are 200,000 pieces of data in total.
![image desc](https://labex.io/upload/T/N/A/zGF2j2pv4KmN.jpg)
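As a sanity check, the target key count should track the source: each run of data.py adds 100,000 keys (10 processes x 10,000 keys), so after the initial load plus one incremental run, both sides should report the same total. A small sketch of that bookkeeping (the run counts here are illustrative):

```python
KEYS_PER_RUN = 100000  # 10 processes x 10,000 keys, per data.py

def expected_total(runs):
    """Expected dbsize on both source and target after `runs` executions of data.py."""
    return runs * KEYS_PER_RUN

def migration_in_sync(source_dbsize, target_dbsize):
    """The sync is caught up when source and target report the same key count."""
    return source_dbsize == target_dbsize

# One run before the migration plus one run during the sync:
total = expected_total(2)
print(total)  # 200000
print(migration_in_sync(total, total))  # True
```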
Reminder:
Before you leave this lab, remember to log out of your Alibaba Cloud RAM account before you click the 'Stop' button of your lab. Otherwise, you will encounter issues when opening a new lab session in the same browser:
![image desc](https://labex.io/upload/L/O/A/wNqHv3R4rO6f.png)
![image desc](https://labex.io/upload/G/Q/S/nohwU5ZhENyN.png)
6. Experiment summary
This experiment introduced the process of using the Redis-shake migration tool to migrate the existing and incremental data of a local Redis instance to Alibaba Cloud Redis. ApsaraDB for Redis offers multiple memory specifications, and users can upgrade specifications as business volume grows. Under the cluster architecture, it supports elastic expansion of the database system's storage space and throughput, breaking through the QPS bottleneck of massive data and handling millions of reads and writes per second.