Merge pull request #4457 from FederatedAI/develop-1.9.1
Develop 1.9.1
mgqa34 authored Nov 23, 2022
2 parents f6142c4 + 45a18b8 commit 9b54d9f
Showing 28 changed files with 448 additions and 110 deletions.
6 changes: 3 additions & 3 deletions .gitmodules
@@ -1,12 +1,12 @@
[submodule "fateboard"]
path = fateboard
url = https://github.com/FederatedAI/FATE-Board.git
branch = v1.9.0
branch = v1.9.1
[submodule "eggroll"]
path = eggroll
url = https://github.com/WeBankFinTech/eggroll.git
branch = v2.4.5
branch = v2.4.6
[submodule "fateflow"]
path = fateflow
url = https://github.com/FederatedAI/FATE-Flow.git
branch = v1.9.0
branch = v1.9.1
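After these bumps, running `git submodule sync` and `git submodule update --init --recursive` at the repository root moves each submodule to its new pin. The pins themselves can be listed straight from `.gitmodules` with `git config`; a self-contained sketch, using a sample file that mirrors the 1.9.1 entries above (on a real checkout, point `-f` at the repo's own `.gitmodules`):

```shell
# Write a sample .gitmodules mirroring the 1.9.1 pins, then list the
# recorded branches with git config (behaves the same on the real file).
dir=$(mktemp -d)
cat > "$dir/.gitmodules" <<'EOF'
[submodule "fateboard"]
	branch = v1.9.1
[submodule "eggroll"]
	branch = v2.4.6
[submodule "fateflow"]
	branch = v1.9.1
EOF
git config -f "$dir/.gitmodules" --get-regexp '\.branch$'
```

The output is one `submodule.<name>.branch <value>` line per entry, which makes it easy to confirm all three pins at a glance.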
10 changes: 10 additions & 0 deletions RELEASE.md
@@ -1,3 +1,13 @@
## Release 1.9.1
### Major Features and Improvements
> Bug-Fix
* Fix cipher compression with large Hessian value for HeteroSecureBoost
* Fix tweedie-loss calculation in HeteroSecureBoost
* Fix Intersection summary when left-joining data with match_id
* Fix event/non_event statistic for WOE computation in HeteroFeatureBinning
* Fix default sid name display for data uploaded with meta


## Release 1.9.0
### Major Features and Improvements
> FederatedML
@@ -153,30 +153,30 @@ ssh [email protected]

**Upload the following packages to the servers**

1. wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u192.tar.gz
2. wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.0/hadoop-3.2.0.tar.gz
1. wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u345.tar.xz
2. wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.1/hadoop-3.3.1.tar.gz
3. wget https://downloads.lightbend.com/scala/2.12.10/scala-2.12.10.tgz
4. wget https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-hadoop3.2.tgz
5. wget https://archive.apache.org/dist/zookeeper/zookeeper-3.4.14/zookeeper-3.4.14.tar.gz
5. wget https://archive.apache.org/dist/zookeeper/zookeeper-3.6.3/apache-zookeeper-3.6.3-bin.tar.gz

**Extract**

```bash
tar xvf hadoop-3.2.0.tar.gz -C /data/projects/common
tar xvf hadoop-3.3.1.tar.gz -C /data/projects/common
tar xvf scala-2.12.10.tgz -C /data/projects/common
tar xvf spark-3.1.2-bin-hadoop3.2.tgz -C /data/projects/common
tar xvf zookeeper-3.4.14.tar.gz -C /data/projects/common
tar xvf jdk-8u192-linux-x64.tar.gz -C /data/projects/common/jdk
mv hadoop-3.2.0 hadoop
tar xJf jdk-8u345.tar.xz -C /data/projects/common/jdk
mv hadoop-3.3.1 hadoop
mv scala-2.12.10 scala
mv spark-3.1.2-bin-hadoop3.2 spark
mv zookeeper-3.4.14 zookeeper
mv apache-zookeeper-3.6.3-bin zookeeper
```
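Note that the JDK archive is now `.tar.xz`, which needs `tar xJf` rather than the `xzf`/`xvf` used for the gzip tarballs (modern GNU tar can also auto-detect compression with plain `tar xf`). A small hypothetical helper, shown only to make the extension-to-flags mapping explicit:

```shell
# Pick the tar decompression flags from the archive extension.
tar_flags() {
  case $1 in
    *.tar.gz|*.tgz) echo xzf ;;
    *.tar.xz)       echo xJf ;;
    *)              echo "unsupported archive: $1" >&2; return 1 ;;
  esac
}

tar_flags jdk-8u345.tar.xz                 # -> xJf
tar_flags spark-3.1.2-bin-hadoop3.2.tgz    # -> xzf
```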

**Configure /etc/profile**

```bash
export JAVA_HOME=/data/projects/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/common/jdk/jdk-8u345
export PATH=$JAVA_HOME/bin:$PATH
export HADOOP_HOME=/data/projects/common/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
@@ -226,7 +226,7 @@ cd /data/projects/common/hadoop/etc/hadoop
**In hadoop-env.sh and yarn-env.sh**
**Add**: export JAVA_HOME=/data/projects/common/jdk/jdk1.8.0_192
**Add**: export JAVA_HOME=/data/projects/common/jdk/jdk-8u345
**In /data/projects/common/hadoop/etc/hadoop, modify `core-site.xml`, `hdfs-site.xml`, `mapred-site.xml`, and `yarn-site.xml`; change the IPs, hostnames, and paths according to the actual environment. See the examples below**
@@ -631,7 +631,7 @@ spark.yarn.jars hdfs://fate-cluster/tmp/spark/jars/\*.jar
**Add to spark-env.sh:**
```
export JAVA_HOME=/data/projects/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/common/jdk/jdk-8u345
export SCALA_HOME=/data/projects/common/scala
export HADOOP_HOME=/data/projects/common/hadoop
export HADOOP_CONF_DIR=\$HADOOP_HOME/etc/hadoop
@@ -154,30 +154,30 @@ ssh [email protected]

**Upload the following packages to the servers**

1. wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u192.tar.gz
2. wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.0/hadoop-3.2.0.tar.gz
1. wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u345.tar.xz
2. wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.1/hadoop-3.3.1.tar.gz
3. wget https://downloads.lightbend.com/scala/2.12.10/scala-2.12.10.tgz
4. wget https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-hadoop3.2.tgz
5. wget https://archive.apache.org/dist/zookeeper/zookeeper-3.4.14/zookeeper-3.4.14.tar.gz
5. wget https://archive.apache.org/dist/zookeeper/zookeeper-3.6.3/apache-zookeeper-3.6.3-bin.tar.gz

**Extract**

```bash
tar xvf hadoop-3.2.0.tar.gz -C /data/projects/common
tar xvf hadoop-3.3.1.tar.gz -C /data/projects/common
tar xvf scala-2.12.10.tgz -C /data/projects/common
tar xvf spark-3.1.2-bin-hadoop3.2.tgz -C /data/projects/common
tar xvf zookeeper-3.4.14.tar.gz -C /data/projects/common
tar xvf jdk-8u192-linux-x64.tar.gz -C /data/projects/common/jdk
mv hadoop-3.2.0 hadoop
tar xJf jdk-8u345.tar.xz -C /data/projects/common/jdk
mv hadoop-3.3.1 hadoop
mv scala-2.12.10 scala
mv spark-3.1.2-bin-hadoop3.2 spark
mv zookeeper-3.4.14 zookeeper
mv apache-zookeeper-3.6.3-bin zookeeper
```

**Configure /etc/profile**

```bash
export JAVA_HOME=/data/projects/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/common/jdk/jdk-8u345
export PATH=$JAVA_HOME/bin:$PATH
export HADOOP_HOME=/data/projects/common/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
@@ -227,7 +227,7 @@ cd /data/projects/common/hadoop/etc/hadoop
**In hadoop-env.sh and yarn-env.sh**
**Add**: export JAVA_HOME=/data/projects/common/jdk/jdk1.8.0_192
**Add**: export JAVA_HOME=/data/projects/common/jdk/jdk-8u345
**In /data/projects/common/hadoop/etc/hadoop, modify core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml; change the IPs, hostnames, and paths according to the actual environment. See the examples below**
@@ -629,7 +629,7 @@ spark.yarn.jars hdfs://fate-cluster/tmp/spark/jars/\*.jar
**Add to spark-env.sh**
export JAVA_HOME=/data/projects/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/common/jdk/jdk-8u345
export SCALA_HOME=/data/projects/common/scala
@@ -6,7 +6,7 @@

```
1. jdk-8u192-linux-x64.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u192.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u345.tar.xz
2. spark-3.1.2-bin-hadoop3.2.tgz
wget https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-hadoop3.2.tgz
```
@@ -21,13 +21,13 @@ tar xvf spark-3.1.2-bin-hadoop3.2.tgz -C /data/projects/fate/common
#If JDK is not deployed in the current environment, execute
mkdir -p /data/projects/fate/common/jdk
#decompression
tar xzf jdk-8u192-linux-x64.tar.gz -C /data/projects/fate/common/jdk
tar xJf jdk-8u345.tar.xz -C /data/projects/fate/common/jdk
```
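Before wiring up `/etc/profile`, it can be worth confirming that the unpacked JDK actually sits where `JAVA_HOME` will point. A hedged sketch, assuming the directory inside the tarball is named `jdk-8u345` as in the exports below:

```shell
# Check the expected JDK layout before editing /etc/profile.
JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
if [ -x "$JAVA_HOME/bin/java" ]; then
  "$JAVA_HOME/bin/java" -version
else
  echo "JDK not found under $JAVA_HOME" >&2
fi
```

If the message prints, re-check the extraction target directory and the top-level directory name produced by the tarball before proceeding.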

**configure /etc/profile**

```bash
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
export PATH=$JAVA_HOME/bin:$PATH
export SPARK_HOME=/data/projects/fate/common/spark-3.1.2-bin-hadoop3.2
export PATH=$SPARK_HOME/bin:$PATH
@@ -39,7 +39,7 @@ export PATH=$SPARK_HOME/bin:$PATH
cd /data/projects/fate/common/spark-3.1.2-bin-hadoop3.2/conf
cp spark-env.sh.template spark-env.sh
#Add parameters
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
export SPARK_MASTER_IP={Host IP}
export SPARK_MASTER_PORT=7077
export SPARK_MASTER_WEBUI_PORT=9080
@@ -6,7 +6,7 @@

```
1. jdk-8u192-linux-x64.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u192.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u345.tar.xz
2. spark-3.1.2-bin-hadoop3.2.tgz
wget https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-hadoop3.2.tgz
```
@@ -21,13 +21,13 @@ tar xvf spark-3.1.2-bin-hadoop3.2.tgz -C /data/projects/fate/common
#Execute if JDK is not deployed in the current environment
mkdir -p /data/projects/fate/common/jdk
#Extract
tar xzf jdk-8u192-linux-x64.tar.gz -C /data/projects/fate/common/jdk
tar xJf jdk-8u345.tar.xz -C /data/projects/fate/common/jdk
```

**Configure /etc/profile**

```bash
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
export PATH=$JAVA_HOME/bin:$PATH
export SPARK_HOME=/data/projects/fate/common/spark-3.1.2-bin-hadoop3.2
export PATH=$SPARK_HOME/bin:$PATH
@@ -39,7 +39,7 @@ export PATH=$SPARK_HOME/bin:$PATH
cd /data/projects/fate/common/spark-3.1.2-bin-hadoop3.2/conf
cp spark-env.sh.template spark-env.sh
#Add parameters
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
export SPARK_MASTER_IP={Host IP}
export SPARK_MASTER_PORT=7077
export SPARK_MASTER_WEBUI_PORT=9080
@@ -177,8 +177,8 @@ Execute on the target server (192.168.0.1 with extranet environment) under the a
mkdir -p /data/projects/install
cd /data/projects/install
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/Miniconda3-py38_4.12.0-Linux-x86_64.sh
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u192-linux-x64.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/mysql-8.0.28.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u345.tar.xz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/mysql-fate-8.0.28.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/openresty-1.17.8.2.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/fate/${version}/release/pip_packages_fate_${version}.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/fate/${version}/release/fate_install_${version}_release.tar.gz
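The `${version}` placeholders in the last two URLs are expanded by the shell, so the variable must be set before running the `wget` commands. A minimal sketch, assuming the 1.9.1 release targeted by this commit (the URL layout is taken verbatim from the commands above):

```shell
# Set the release version once, then the wget URLs expand correctly.
version=1.9.1
url="https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/fate/${version}/release/fate_install_${version}_release.tar.gz"
echo "$url"
```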
@@ -282,7 +282,7 @@ mysql>show databases;
mkdir -p /data/projects/fate/common/jdk
#Uncompress
cd /data/projects/install
tar xzf jdk-8u192-linux-x64.tar.gz -C /data/projects/fate/common/jdk
tar xJf jdk-8u345.tar.xz -C /data/projects/fate/common/jdk
```

### 5.5 Deploying python
@@ -345,7 +345,7 @@ export FATE_DEPLOY_BASE=\$fate_project_base
export PYTHONPATH=/data/projects/fate/fateflow/python:/data/projects/fate/fate/python
venv=/data/projects/fate/common/python/venv
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
export PATH=\$PATH:\$JAVA_HOME/bin
source \${venv}/bin/activate
export FATE_LOG_LEVEL=DEBUG
@@ -173,8 +173,8 @@ fi
mkdir -p /data/projects/install
cd /data/projects/install
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/Miniconda3-py38_4.12.0-Linux-x86_64.sh
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u192-linux-x64.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/mysql-8.0.28.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/jdk-8u345.tar.xz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/mysql-fate-8.0.28.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/resources/openresty-1.17.8.2.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/fate/${version}/release/pip_packages_fate_${version}.tar.gz
wget https://webank-ai-1251170195.cos.ap-guangzhou.myqcloud.com/fate/${version}/release/fate_install_${version}_release.tar.gz
@@ -277,7 +277,7 @@ mysql>show databases;
mkdir -p /data/projects/fate/common/jdk
#Extract
cd /data/projects/install
tar xzf jdk-8u192-linux-x64.tar.gz -C /data/projects/fate/common/jdk
tar xJf jdk-8u345.tar.xz -C /data/projects/fate/common/jdk
```

### 5.5 Deploying python
@@ -338,7 +338,7 @@ export FATE_DEPLOY_BASE=\$fate_project_base
export PYTHONPATH=/data/projects/fate/fateflow/python:/data/projects/fate/fate/python
venv=/data/projects/fate/common/python/venv
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u192
export JAVA_HOME=/data/projects/fate/common/jdk/jdk-8u345
export PATH=\$PATH:\$JAVA_HOME/bin
source \${venv}/bin/activate
export FATE_LOG_LEVEL=DEBUG
75 changes: 70 additions & 5 deletions deploy/upgrade/sql/1.7.1-1.7.2.sql
@@ -28,11 +28,76 @@ ALTER TABLE t_task DROP INDEX task_f_auto_retries;
ALTER TABLE t_task DROP INDEX task_f_worker_id;
ALTER TABLE t_task DROP INDEX task_f_party_status;

ALTER TABLE trackingmetric DROP INDEX trackingmetric_f_metric_namespace;
ALTER TABLE trackingmetric DROP INDEX trackingmetric_f_metric_name;
ALTER TABLE trackingmetric DROP INDEX trackingmetric_f_type;
DROP PROCEDURE IF EXISTS alter_trackingmetric;
DELIMITER //

CREATE PROCEDURE alter_trackingmetric()
BEGIN
DECLARE done BOOL DEFAULT FALSE;
DECLARE date_ CHAR(8);

DECLARE cur CURSOR FOR SELECT RIGHT(TABLE_NAME, 8) FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = (SELECT DATABASE()) AND TABLE_NAME LIKE 't\_tracking\_metric\_%';
DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = TRUE;

OPEN cur;

loop_: LOOP
FETCH cur INTO date_;
IF done THEN
LEAVE loop_;
END IF;

SET @sql = CONCAT(
'ALTER TABLE t_tracking_metric_', date_,
' DROP INDEX trackingmetric_', date_, '_f_metric_namespace,',
' DROP INDEX trackingmetric_', date_, '_f_metric_name,',
' DROP INDEX trackingmetric_', date_, '_f_type;'
);
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
END LOOP;

CLOSE cur;
END //

DELIMITER ;
CALL alter_trackingmetric();
DROP PROCEDURE alter_trackingmetric;

ALTER TABLE trackingoutputdatainfo DROP INDEX trackingoutputdatainfo_f_task_version;
DROP PROCEDURE IF EXISTS alter_trackingoutputdatainfo;
DELIMITER //

CREATE PROCEDURE alter_trackingoutputdatainfo()
BEGIN
DECLARE done BOOL DEFAULT FALSE;
DECLARE date_ CHAR(8);

DECLARE cur CURSOR FOR SELECT RIGHT(TABLE_NAME, 8) FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = (SELECT DATABASE()) AND TABLE_NAME LIKE 't\_tracking\_output\_data\_info\_%';
DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = TRUE;

OPEN cur;

loop_: LOOP
FETCH cur INTO date_;
IF done THEN
LEAVE loop_;
END IF;

    SET @sql = CONCAT('ALTER TABLE t_tracking_output_data_info_', date_, ' DROP INDEX trackingoutputdatainfo_', date_, '_f_task_version;');
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
END LOOP;

CLOSE cur;
END //

DELIMITER ;
CALL alter_trackingoutputdatainfo();
DROP PROCEDURE alter_trackingoutputdatainfo;
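The two procedures above follow the same pattern: cursor over `INFORMATION_SCHEMA.TABLES` to find each dated partition table, build one `ALTER TABLE … DROP INDEX …` statement with `CONCAT`, and run it via `PREPARE`/`EXECUTE`. The separating space before `DROP INDEX` must be part of the concatenated pieces, or the table name and keyword fuse into invalid SQL. The string the procedure builds for a single (hypothetical) date can be sketched from the shell:

```shell
# What the CONCAT produces for one partition date; note the space
# between the table name and DROP INDEX.
date_=20220101
sql="ALTER TABLE t_tracking_output_data_info_${date_} DROP INDEX trackingoutputdatainfo_${date_}_f_task_version;"
echo "$sql"
```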

ALTER TABLE t_machine_learning_model_info DROP INDEX machinelearningmodelinfo_f_role;
ALTER TABLE t_machine_learning_model_info DROP INDEX machinelearningmodelinfo_f_party_id;
@@ -65,7 +130,7 @@ BEGIN
LEAVE loop_;
END IF;

SET @sql = CONCAT('ALTER TABLE t_component_summary_', date_, ' DROP INDEX componentsummary_', date_, '_f_task_version');
SET @sql = CONCAT('ALTER TABLE t_component_summary_', date_, ' DROP INDEX componentsummary_', date_, '_f_task_version;');
PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;