
Federated Machine Learning


FederatedML includes implementations of many common machine learning algorithms for federated learning. All modules are developed in a decoupled, modular way to enhance scalability. Specifically, we provide:

  1. Federated Statistic: PSI, Union, Pearson Correlation, etc.
  2. Federated Information Retrieval: PIR (SIR) based on OT
  3. Federated Feature Engineering: Feature Sampling, Feature Binning, Feature Selection, etc.
  4. Federated Machine Learning Algorithms: LR, GBDT, DNN, Transfer Learning, and Unsupervised Learning, supporting both heterogeneous and homogeneous styles.
  5. Model Evaluation: Binary | Multiclass | Regression | Clustering Evaluation, and local vs. federated comparison.
  6. Secure Protocol: multiple security protocols for secure multi-party computation and interaction between participants.

Algorithm List

| Algorithm | Module Name | Description | Data Input | Data Output | Model Input | Model Output |
| --- | --- | --- | --- | --- | --- | --- |
| DataIO | DataIO | Transforms user-uploaded data into Instance objects (deprecated in FATE-v1.7, use DataTransform instead). | Table, values are raw data. | Transformed Table, values are data instances. | | DataIO Model |
| DataTransform | DataTransform | Transforms user-uploaded data into Instance objects. | Table, values are raw data. | Transformed Table, values are data instances. | | DataTransform Model |
| Intersect | Intersection | Computes the intersection of multiple parties' data sets without leaking difference-set information. Mainly used in hetero-scenario tasks. | Table. | Table with only common instance keys. | | Intersect Model |
| Federated Sampling | FederatedSample | Samples data so that its distribution becomes balanced on each party. This module supports standalone and federated versions. | Table | Table of sampled data; both random and stratified sampling methods are supported. | | |
| Feature Scale | FeatureScale | Module for feature scaling and standardization. | Table, values are instances. | Transformed Table. | | Transform factors such as min/max, mean/std. |
| Hetero Feature Binning | HeteroFeatureBinning | Bins input data, calculates each column's IV and WOE, and transforms data according to the binning information. | Table, values are instances. | Transformed Table. | | IV/WOE, split points, event count, non-event count, etc. of each column. |
| Homo Feature Binning | HomoFeatureBinning | Calculates quantile binning across multiple parties. | Table | Transformed Table | | Split points of each column |
| OneHot Encoder | OneHotEncoder | Transforms a column into one-hot format. | Table, values are instances. | Transformed Table with new header. | | Feature-name mapping between original header and new header. |
| Hetero Feature Selection | HeteroFeatureSelection | Provides 5 types of filters; each filter can select columns according to user config. | Table | Transformed Table with new header and filtered data instances. | If IV filters are used, a hetero_binning model is needed. | Whether each column is filtered. |
| Union | Union | Combines multiple data tables into one. | Tables. | Table with combined values from input Tables. | | |
| Hetero-LR | HeteroLR | Builds a hetero logistic regression model across multiple parties. | Table, values are instances. | Table, values are instances. | | Logistic Regression Model, consisting of model-meta and model-param. |
| Local Baseline | LocalBaseline | Wrapper that runs a sklearn (scikit-learn) logistic regression model on local data. | Table, values are instances. | Table, values are instances. | | |
| Hetero-LinR | HeteroLinR | Builds a hetero linear regression model across multiple parties. | Table, values are instances. | Table, values are instances. | | Linear Regression Model, consisting of model-meta and model-param. |
| Hetero-Poisson | HeteroPoisson | Builds a hetero Poisson regression model across multiple parties. | Table, values are instances. | Table, values are instances. | | Poisson Regression Model, consisting of model-meta and model-param. |
| Homo-LR | HomoLR | Builds a homo logistic regression model across multiple parties. | Table, values are instances. | Table, values are instances. | | Logistic Regression Model, consisting of model-meta and model-param. |
| Homo-NN | HomoNN | Builds a homo neural network model across multiple parties. | Table, values are instances. | Table, values are instances. | | Neural Network Model, consisting of model-meta and model-param. |
| Hetero Secure Boosting | HeteroSecureBoost | Builds a hetero secure boosting model across multiple parties. | Table, values are instances. | Table, values are instances. | | SecureBoost Model, consisting of model-meta and model-param. |
| Hetero Fast Secure Boosting | HeteroFastSecureBoost | Builds a hetero secure boosting model across multiple parties in layered/mix manners. | Table, values are instances. | Table, values are instances. | | FastSecureBoost Model, consisting of model-meta and model-param. |
| Hetero Secure Boost Feature Transformer | SBTFeatureTransformer | Encodes samples using Hetero SBT leaf indices. | Table, values are instances. | Table, values are instances. | | SBT Transformer Model |
| Evaluation | Evaluation | Outputs the model evaluation metrics for the user. | Table(s), values are instances. | | | |
| Hetero Pearson | HeteroPearson | Calculates hetero correlation of features from different parties. | Table, values are instances. | | | |
| Hetero-NN | HeteroNN | Builds a hetero neural network model. | Table, values are instances. | Table, values are instances. | | Hetero Neural Network Model, consisting of model-meta and model-param. |
| Homo Secure Boosting | HomoSecureBoost | Builds a homo secure boosting model across multiple parties. | Table, values are instances. | Table, values are instances. | | SecureBoost Model, consisting of model-meta and model-param. |
| Homo OneHot Encoder | HomoOneHotEncoder | Builds a homo one-hot encoder model across multiple parties. | Table, values are instances. | Transformed Table with new header. | | Feature-name mapping between original header and new header. |
| Hetero Data Split | HeteroDataSplit | Splits one data table into 3 tables by given ratios or counts. | Table, values are instances. | 3 Tables, values are instances. | | |
| Homo Data Split | HomoDataSplit | Splits one data table into 3 tables by given ratios or counts. | Table, values are instances. | 3 Tables, values are instances. | | |
| Column Expand | ColumnExpand | Adds an arbitrary number of columns with user-provided values. | Table, values are raw data. | Transformed Table with added column(s) and new header. | | Column Expand Model |
| Secure Information Retrieval | SecureInformationRetrieval | Securely retrieves information from the host through oblivious transfer. | Table, values are instances. | Table, values are instances. | | |
| Hetero Federated Transfer Learning | FTL | Builds a hetero FTL model between 2 parties. | Table, values are instances. | | | Hetero FTL Model |
| Hetero KMeans | HeteroKMeans | Builds a hetero K-means model across multiple parties. | Table, values are instances. | Table, values are instances; the Arbiter outputs 2 Tables. | | Hetero KMeans Model |
| PSI | PSI | Computes the PSI value of features between two tables. | Table, values are instances. | | | PSI Results |
| Data Statistics | DataStatistics | Computes statistics on the data, including mean, maximum, minimum, median, etc. | Table, values are instances. | Table | | Statistic Result |
| Scorecard | Scorecard | Scales predict scores to credit scores by given scaling parameters. | Table, values are predict scores. | Table, values are score results. | | |
| Sample Weight | SampleWeight | Assigns weights to instances according to user-specified parameters. | Table, values are instances. | Table, values are weighted instances. | | SampleWeight Model |
| Feldman Verifiable Sum | FeldmanVerifiableSum | Sums multiple private values without exposing the data. | Table, values to sum. | Table, values are sum results. | | |
| Feature Imputation | FeatureImputation | Imputes missing features using arbitrary methods/values. | Table, values are Instances. | Table, values with missing features filled. | | FeatureImputation Model |
| Label Transform | LabelTransform | Replaces label values of input data instances and predict results. | Table, values are Instances or prediction results. | Table, values with transformed label values. | | LabelTransform Model |
| Hetero SSHE Logistic Regression | HeteroSSHELR | Builds a hetero logistic regression model without an arbiter. | Table, values are Instances. | Table, values are Instances. | | SSHE LR Model |
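
As a hedged illustration of how these components chain together in a training job, here is a minimal sketch using the fate_client pipeline API; the party IDs, table names, and parameter values are demo placeholders, not part of this reference.

```python
from pipeline.backend.pipeline import PipeLine
from pipeline.component import Reader, DataTransform, Intersection, HeteroSecureBoost
from pipeline.interface import Data

# party ids and uploaded table names below are demo placeholders
pipeline = PipeLine() \
    .set_initiator(role="guest", party_id=9999) \
    .set_roles(guest=9999, host=10000)

reader_0 = Reader(name="reader_0")
reader_0.get_party_instance(role="guest", party_id=9999).component_param(
    table={"name": "breast_hetero_guest", "namespace": "experiment"})
reader_0.get_party_instance(role="host", party_id=10000).component_param(
    table={"name": "breast_hetero_host", "namespace": "experiment"})

# guest side holds the label in the hetero scenario
data_transform_0 = DataTransform(name="data_transform_0")
data_transform_0.get_party_instance(role="guest", party_id=9999).component_param(
    with_label=True)

intersection_0 = Intersection(name="intersection_0")
hetero_secureboost_0 = HeteroSecureBoost(name="hetero_secureboost_0", num_trees=5)

# wire components: Reader -> DataTransform -> Intersection -> HeteroSecureBoost
pipeline.add_component(reader_0)
pipeline.add_component(data_transform_0, data=Data(data=reader_0.output.data))
pipeline.add_component(intersection_0, data=Data(data=data_transform_0.output.data))
pipeline.add_component(hetero_secureboost_0, data=Data(train_data=intersection_0.output.data))

pipeline.compile()
pipeline.fit()
```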

Secure Protocol

Params


Modules

base_param
BaseParam
Source code in federatedml/param/base_param.py
class BaseParam(metaclass=_StaticDefaultMeta):
    def __init__(self):
        pass

    def set_name(self, name: str):
        self._name = name
        return self

    def check(self):
        raise NotImplementedError("Parameter Object should have be check")

    @classmethod
    def _get_or_init_deprecated_params_set(cls):
        if not hasattr(cls, _DEPRECATED_PARAMS):
            setattr(cls, _DEPRECATED_PARAMS, set())
        return getattr(cls, _DEPRECATED_PARAMS)

    def _get_or_init_feeded_deprecated_params_set(self, conf=None):
        if not hasattr(self, _FEEDED_DEPRECATED_PARAMS):
            if conf is None:
                setattr(self, _FEEDED_DEPRECATED_PARAMS, set())
            else:
                setattr(
                    self,
                    _FEEDED_DEPRECATED_PARAMS,
                    set(conf[_FEEDED_DEPRECATED_PARAMS]),
                )
        return getattr(self, _FEEDED_DEPRECATED_PARAMS)

    def _get_or_init_user_feeded_params_set(self, conf=None):
        if not hasattr(self, _USER_FEEDED_PARAMS):
            if conf is None:
                setattr(self, _USER_FEEDED_PARAMS, set())
            else:
                setattr(self, _USER_FEEDED_PARAMS, set(conf[_USER_FEEDED_PARAMS]))
        return getattr(self, _USER_FEEDED_PARAMS)

    def get_user_feeded(self):
        return self._get_or_init_user_feeded_params_set()

    def get_feeded_deprecated_params(self):
        return self._get_or_init_feeded_deprecated_params_set()

    @property
    def _deprecated_params_set(self):
        return {name: True for name in self.get_feeded_deprecated_params()}

    def as_dict(self):
        def _recursive_convert_obj_to_dict(obj):
            ret_dict = {}
            for attr_name in list(obj.__dict__):
                # get attr
                attr = getattr(obj, attr_name)
                if attr and type(attr).__name__ not in dir(builtins):
                    ret_dict[attr_name] = _recursive_convert_obj_to_dict(attr)
                else:
                    ret_dict[attr_name] = attr

            return ret_dict

        return _recursive_convert_obj_to_dict(self)

    def update(self, conf, allow_redundant=False):
        update_from_raw_conf = conf.get(_IS_RAW_CONF, True)
        if update_from_raw_conf:
            deprecated_params_set = self._get_or_init_deprecated_params_set()
            feeded_deprecated_params_set = (
                self._get_or_init_feeded_deprecated_params_set()
            )
            user_feeded_params_set = self._get_or_init_user_feeded_params_set()
            setattr(self, _IS_RAW_CONF, False)
        else:
            feeded_deprecated_params_set = (
                self._get_or_init_feeded_deprecated_params_set(conf)
            )
            user_feeded_params_set = self._get_or_init_user_feeded_params_set(conf)

        def _recursive_update_param(param, config, depth, prefix):
            if depth > consts.PARAM_MAXDEPTH:
                raise ValueError("Param define nesting too deep!!!, can not parse it")

            inst_variables = param.__dict__
            redundant_attrs = []
            for config_key, config_value in config.items():
                # redundant attr
                if config_key not in inst_variables:
                    if not update_from_raw_conf and config_key.startswith("_"):
                        setattr(param, config_key, config_value)
                    else:
                        redundant_attrs.append(config_key)
                    continue

                full_config_key = f"{prefix}{config_key}"

                if update_from_raw_conf:
                    # add user feeded params
                    user_feeded_params_set.add(full_config_key)

                    # update user feeded deprecated param set
                    if full_config_key in deprecated_params_set:
                        feeded_deprecated_params_set.add(full_config_key)

                # supported attr
                attr = getattr(param, config_key)
                if type(attr).__name__ in dir(builtins) or attr is None:
                    setattr(param, config_key, config_value)

                else:
                    # recursive set obj attr
                    sub_params = _recursive_update_param(
                        attr, config_value, depth + 1, prefix=f"{prefix}{config_key}."
                    )
                    setattr(param, config_key, sub_params)

            if not allow_redundant and redundant_attrs:
                raise ValueError(
                    f"cpn `{getattr(self, '_name', type(self))}` has redundant parameters: `{[redundant_attrs]}`"
                )

            return param

        return _recursive_update_param(param=self, config=conf, depth=0, prefix="")

    def extract_not_builtin(self):
        def _get_not_builtin_types(obj):
            ret_dict = {}
            for variable in obj.__dict__:
                attr = getattr(obj, variable)
                if attr and type(attr).__name__ not in dir(builtins):
                    ret_dict[variable] = _get_not_builtin_types(attr)

            return ret_dict

        return _get_not_builtin_types(self)

    def validate(self):
        self.builtin_types = dir(builtins)
        self.func = {
            "ge": self._greater_equal_than,
            "le": self._less_equal_than,
            "in": self._in,
            "not_in": self._not_in,
            "range": self._range,
        }
        home_dir = os.path.abspath(os.path.dirname(os.path.realpath(__file__)))
        param_validation_path_prefix = home_dir + "/param_validation/"

        param_name = type(self).__name__
        param_validation_path = "/".join(
            [param_validation_path_prefix, param_name + ".json"]
        )

        validation_json = None

        try:
            with open(param_validation_path, "r") as fin:
                validation_json = json.loads(fin.read())
        except Exception:
            # the validation file is optional; skip validation if it is absent or unreadable
            return

        self._validate_param(self, validation_json)

    def _validate_param(self, param_obj, validation_json):
        default_section = type(param_obj).__name__
        var_list = param_obj.__dict__

        for variable in var_list:
            attr = getattr(param_obj, variable)

            if type(attr).__name__ in self.builtin_types or attr is None:
                if variable not in validation_json:
                    continue

                validation_dict = validation_json[default_section][variable]
                value = getattr(param_obj, variable)
                value_legal = False

                for op_type in validation_dict:
                    if self.func[op_type](value, validation_dict[op_type]):
                        value_legal = True
                        break

                if not value_legal:
                    raise ValueError(
                        "Plase check runtime conf, {} = {} does not match user-parameter restriction".format(
                            variable, value
                        )
                    )

            elif variable in validation_json:
                self._validate_param(attr, validation_json)

    @staticmethod
    def check_string(param, descr):
        if type(param).__name__ not in ["str"]:
            raise ValueError(
                descr + " {} not supported, should be string type".format(param)
            )

    @staticmethod
    def check_positive_integer(param, descr):
        if type(param).__name__ not in ["int", "long"] or param <= 0:
            raise ValueError(
                descr + " {} not supported, should be positive integer".format(param)
            )

    @staticmethod
    def check_positive_number(param, descr):
        if type(param).__name__ not in ["float", "int", "long"] or param <= 0:
            raise ValueError(
                descr + " {} not supported, should be positive numeric".format(param)
            )

    @staticmethod
    def check_nonnegative_number(param, descr):
        if type(param).__name__ not in ["float", "int", "long"] or param < 0:
            raise ValueError(
                descr
                + " {} not supported, should be non-negative numeric".format(param)
            )

    @staticmethod
    def check_decimal_float(param, descr):
        if type(param).__name__ not in ["float", "int"] or param < 0 or param > 1:
            raise ValueError(
                descr
                + " {} not supported, should be a float number in range [0, 1]".format(
                    param
                )
            )

    @staticmethod
    def check_boolean(param, descr):
        if type(param).__name__ != "bool":
            raise ValueError(
                descr + " {} not supported, should be bool type".format(param)
            )

    @staticmethod
    def check_open_unit_interval(param, descr):
        if type(param).__name__ not in ["float"] or param <= 0 or param >= 1:
            raise ValueError(
                descr + " should be a numeric number between 0 and 1 exclusively"
            )

    @staticmethod
    def check_valid_value(param, descr, valid_values):
        if param not in valid_values:
            raise ValueError(
                descr
                + " {} is not supported, it should be in {}".format(param, valid_values)
            )

    @staticmethod
    def check_defined_type(param, descr, types):
        if type(param).__name__ not in types:
            raise ValueError(
                descr + " {} not supported, should be one of {}".format(param, types)
            )

    @staticmethod
    def check_and_change_lower(param, valid_list, descr=""):
        if type(param).__name__ != "str":
            raise ValueError(
                descr
                + " {} not supported, should be one of {}".format(param, valid_list)
            )

        lower_param = param.lower()
        if lower_param in valid_list:
            return lower_param
        else:
            raise ValueError(
                descr
                + " {} not supported, should be one of {}".format(param, valid_list)
            )

    @staticmethod
    def _greater_equal_than(value, limit):
        return value >= limit - consts.FLOAT_ZERO

    @staticmethod
    def _less_equal_than(value, limit):
        return value <= limit + consts.FLOAT_ZERO

    @staticmethod
    def _range(value, ranges):
        in_range = False
        for left_limit, right_limit in ranges:
            if (
                left_limit - consts.FLOAT_ZERO
                <= value
                <= right_limit + consts.FLOAT_ZERO
            ):
                in_range = True
                break

        return in_range

    @staticmethod
    def _in(value, right_value_list):
        return value in right_value_list

    @staticmethod
    def _not_in(value, wrong_value_list):
        return value not in wrong_value_list

    def _warn_deprecated_param(self, param_name, descr):
        if self._deprecated_params_set.get(param_name):
            LOGGER.warning(
                f"{descr} {param_name} is deprecated and ignored in this version."
            )

    def _warn_to_deprecate_param(self, param_name, descr, new_param):
        if self._deprecated_params_set.get(param_name):
            LOGGER.warning(
                f"{descr} {param_name} will be deprecated in future release; "
                f"please use {new_param} instead."
            )
            return True
        return False
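
To make the contract concrete, here is a minimal, hypothetical subclass (ExampleParam is not part of federatedml) sketching the intended pattern: declare defaults in `__init__` and validate them in `check()` with the static helpers above.

```python
from federatedml.param.base_param import BaseParam


class ExampleParam(BaseParam):
    """Hypothetical parameter object illustrating the BaseParam contract."""

    def __init__(self, method="plain", num_round=10, verbose=False):
        super().__init__()
        self.method = method
        self.num_round = num_round
        self.verbose = verbose

    def check(self):
        descr = "example param's"
        # check_and_change_lower normalizes case and validates membership in one step
        self.method = self.check_and_change_lower(self.method, ["plain", "secure"], descr)
        self.check_positive_integer(self.num_round, descr)
        self.check_boolean(self.verbose, descr)
        return True
```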
__init__(self) special
Source code in federatedml/param/base_param.py
def __init__(self):
    pass
set_name(self, name)
Source code in federatedml/param/base_param.py
def set_name(self, name: str):
    self._name = name
    return self
check(self)
Source code in federatedml/param/base_param.py
def check(self):
    raise NotImplementedError("Parameter object should implement check()")
get_user_feeded(self)
Source code in federatedml/param/base_param.py
def get_user_feeded(self):
    return self._get_or_init_user_feeded_params_set()
get_feeded_deprecated_params(self)
Source code in federatedml/param/base_param.py
def get_feeded_deprecated_params(self):
    return self._get_or_init_feeded_deprecated_params_set()
as_dict(self)
Source code in federatedml/param/base_param.py
def as_dict(self):
    def _recursive_convert_obj_to_dict(obj):
        ret_dict = {}
        for attr_name in list(obj.__dict__):
            # get attr
            attr = getattr(obj, attr_name)
            if attr and type(attr).__name__ not in dir(builtins):
                ret_dict[attr_name] = _recursive_convert_obj_to_dict(attr)
            else:
                ret_dict[attr_name] = attr

        return ret_dict

    return _recursive_convert_obj_to_dict(self)
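
Using the hypothetical ExampleParam sketched earlier, as_dict flattens the object, recursing into nested non-builtin param objects:

```python
param = ExampleParam()
print(param.as_dict())
# {'method': 'plain', 'num_round': 10, 'verbose': False}
```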
update(self, conf, allow_redundant=False)
Source code in federatedml/param/base_param.py
def update(self, conf, allow_redundant=False):
    update_from_raw_conf = conf.get(_IS_RAW_CONF, True)
    if update_from_raw_conf:
        deprecated_params_set = self._get_or_init_deprecated_params_set()
        feeded_deprecated_params_set = (
            self._get_or_init_feeded_deprecated_params_set()
        )
        user_feeded_params_set = self._get_or_init_user_feeded_params_set()
        setattr(self, _IS_RAW_CONF, False)
    else:
        feeded_deprecated_params_set = (
            self._get_or_init_feeded_deprecated_params_set(conf)
        )
        user_feeded_params_set = self._get_or_init_user_feeded_params_set(conf)

    def _recursive_update_param(param, config, depth, prefix):
        if depth > consts.PARAM_MAXDEPTH:
            raise ValueError("Param define nesting too deep!!!, can not parse it")

        inst_variables = param.__dict__
        redundant_attrs = []
        for config_key, config_value in config.items():
            # redundant attr
            if config_key not in inst_variables:
                if not update_from_raw_conf and config_key.startswith("_"):
                    setattr(param, config_key, config_value)
                else:
                    redundant_attrs.append(config_key)
                continue

            full_config_key = f"{prefix}{config_key}"

            if update_from_raw_conf:
                # add user feeded params
                user_feeded_params_set.add(full_config_key)

                # update user feeded deprecated param set
                if full_config_key in deprecated_params_set:
                    feeded_deprecated_params_set.add(full_config_key)

            # supported attr
            attr = getattr(param, config_key)
            if type(attr).__name__ in dir(builtins) or attr is None:
                setattr(param, config_key, config_value)

            else:
                # recursive set obj attr
                sub_params = _recursive_update_param(
                    attr, config_value, depth + 1, prefix=f"{prefix}{config_key}."
                )
                setattr(param, config_key, sub_params)

        if not allow_redundant and redundant_attrs:
            raise ValueError(
                f"cpn `{getattr(self, '_name', type(self))}` has redundant parameters: `{[redundant_attrs]}`"
            )

        return param

    return _recursive_update_param(param=self, config=conf, depth=0, prefix="")
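
A hedged usage sketch with the hypothetical ExampleParam from above: update() copies matching keys from a raw conf dict onto the object (recursing into nested param objects), records which keys the user fed, and rejects unknown keys unless allow_redundant=True. Note it does not call check() for you.

```python
param = ExampleParam().update({"method": "SECURE", "num_round": 20})
print(param.get_user_feeded())   # {'method', 'num_round'}

param.check()                    # lower-cases and validates: method becomes 'secure'

ExampleParam().update({"num_rounds": 20})   # typo in key: raises ValueError (redundant parameter)
```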
extract_not_builtin(self)
Source code in federatedml/param/base_param.py
def extract_not_builtin(self):
    def _get_not_builtin_types(obj):
        ret_dict = {}
        for variable in obj.__dict__:
            attr = getattr(obj, variable)
            if attr and type(attr).__name__ not in dir(builtins):
                ret_dict[variable] = _get_not_builtin_types(attr)

        return ret_dict

    return _get_not_builtin_types(self)
validate(self)
Source code in federatedml/param/base_param.py
def validate(self):
    self.builtin_types = dir(builtins)
    self.func = {
        "ge": self._greater_equal_than,
        "le": self._less_equal_than,
        "in": self._in,
        "not_in": self._not_in,
        "range": self._range,
    }
    home_dir = os.path.abspath(os.path.dirname(os.path.realpath(__file__)))
    param_validation_path_prefix = home_dir + "/param_validation/"

    param_name = type(self).__name__
    param_validation_path = "/".join(
        [param_validation_path_prefix, param_name + ".json"]
    )

    validation_json = None

    try:
        with open(param_validation_path, "r") as fin:
            validation_json = json.loads(fin.read())
    except Exception:
        # the validation file is optional; skip validation if it is absent or unreadable
        return

    self._validate_param(self, validation_json)
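
validate() loads an optional per-class rules file named `<ParamClass>.json` from federatedml/param/param_validation/. Below is a hedged sketch of what such a file could contain for the hypothetical ExampleParam, written as a Python literal (the real file is plain JSON, without comments). Note that _validate_param, as written, first tests attribute names against the top level of the JSON and then reads limits from the class-name section, so a rule must appear under both keys to take effect.

```python
# hypothetical contents of param_validation/ExampleParam.json;
# supported operators: "ge", "le", "in", "not_in", "range"
{
    "ExampleParam": {
        "num_round": {"ge": 1, "le": 100},
        "method": {"in": ["plain", "secure"]},
    },
    # duplicated at the top level to satisfy the membership guard in _validate_param
    "num_round": {"ge": 1, "le": 100},
    "method": {"in": ["plain", "secure"]},
}
```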
check_string(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_string(param, descr):
    if type(param).__name__ not in ["str"]:
        raise ValueError(
            descr + " {} not supported, should be string type".format(param)
        )
check_positive_integer(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_positive_integer(param, descr):
    if type(param).__name__ not in ["int", "long"] or param <= 0:
        raise ValueError(
            descr + " {} not supported, should be positive integer".format(param)
        )
check_positive_number(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_positive_number(param, descr):
    if type(param).__name__ not in ["float", "int", "long"] or param <= 0:
        raise ValueError(
            descr + " {} not supported, should be positive numeric".format(param)
        )
check_nonnegative_number(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_nonnegative_number(param, descr):
    if type(param).__name__ not in ["float", "int", "long"] or param < 0:
        raise ValueError(
            descr
            + " {} not supported, should be non-negative numeric".format(param)
        )
check_decimal_float(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_decimal_float(param, descr):
    if type(param).__name__ not in ["float", "int"] or param < 0 or param > 1:
        raise ValueError(
            descr
            + " {} not supported, should be a float number in range [0, 1]".format(
                param
            )
        )
check_boolean(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_boolean(param, descr):
    if type(param).__name__ != "bool":
        raise ValueError(
            descr + " {} not supported, should be bool type".format(param)
        )
check_open_unit_interval(param, descr) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_open_unit_interval(param, descr):
    if type(param).__name__ not in ["float"] or param <= 0 or param >= 1:
        raise ValueError(
            descr + " should be a numeric number between 0 and 1 exclusively"
        )
check_valid_value(param, descr, valid_values) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_valid_value(param, descr, valid_values):
    if param not in valid_values:
        raise ValueError(
            descr
            + " {} is not supported, it should be in {}".format(param, valid_values)
        )
check_defined_type(param, descr, types) staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_defined_type(param, descr, types):
    if type(param).__name__ not in types:
        raise ValueError(
            descr + " {} not supported, should be one of {}".format(param, types)
        )
check_and_change_lower(param, valid_list, descr='') staticmethod
Source code in federatedml/param/base_param.py
@staticmethod
def check_and_change_lower(param, valid_list, descr=""):
    if type(param).__name__ != "str":
        raise ValueError(
            descr
            + " {} not supported, should be one of {}".format(param, valid_list)
        )

    lower_param = param.lower()
    if lower_param in valid_list:
        return lower_param
    else:
        raise ValueError(
            descr
            + " {} not supported, should be one of {}".format(param, valid_list)
        )
deprecated_param(*names)
Source code in federatedml/param/base_param.py
def deprecated_param(*names):
    def _decorator(cls: "BaseParam"):
        deprecated = cls._get_or_init_deprecated_params_set()
        for name in names:
            deprecated.add(name)
        return cls

    return _decorator
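
A hedged usage sketch, continuing the hypothetical ExampleParam: the decorator registers names as deprecated on the class; when a raw conf feeds such a name, update() records it, and the `_warn_*` helpers emit the warning during check().

```python
@deprecated_param("method")
class ExampleParamV2(ExampleParam):
    def check(self):
        # warns only if the user actually fed "method" in the conf
        self._warn_to_deprecate_param("method", "example param's", "mode")
        return super().check()


param = ExampleParamV2().update({"method": "plain"})
param.check()   # logs: example param's method will be deprecated in future release; ...
```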
boosting_param
hetero_deprecated_param_list
homo_deprecated_param_list
Classes
ObjectiveParam (BaseParam)

Define objective parameters used in federated ML.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| objective | {None, 'cross_entropy', 'lse', 'lae', 'log_cosh', 'tweedie', 'fair', 'huber'} | None in host's config; should be str in guest's config. When task_type is classification, only 'cross_entropy' is supported; the other 6 types are supported in regression tasks. | 'cross_entropy' |
| params | None or list | Should be a non-empty list when objective is 'tweedie', 'fair' or 'huber': the first element should be a float larger than 0.0 when objective is 'fair' or 'huber', and a float in [1.0, 2.0) when objective is 'tweedie'. | None |
Source code in federatedml/param/boosting_param.py
class ObjectiveParam(BaseParam):
    """
    Define objective parameters that used in federated ml.

    Parameters
    ----------
    objective : {None, 'cross_entropy', 'lse', 'lae', 'log_cosh', 'tweedie', 'fair', 'huber'}
        None in host's config, should be str in guest'config.
        when task_type is classification, only support 'cross_entropy',
        other 6 types support in regression task

    params : None or list
        should be non empty list when objective is 'tweedie','fair','huber',
        first element of list shoulf be a float-number large than 0.0 when objective is 'fair', 'huber',
        first element of list should be a float-number in [1.0, 2.0) when objective is 'tweedie'
    """

    def __init__(self, objective='cross_entropy', params=None):
        self.objective = objective
        self.params = params

    def check(self, task_type=None):
        if self.objective is None:
            return True

        descr = "objective param's"

        LOGGER.debug('check objective {}'.format(self.objective))

        if task_type not in [consts.CLASSIFICATION, consts.REGRESSION]:
            self.objective = self.check_and_change_lower(self.objective,
                                                   ["cross_entropy", "lse", "lae", "huber", "fair",
                                                    "log_cosh", "tweedie"],
                                                       descr)

        if task_type == consts.CLASSIFICATION:
            if self.objective != "cross_entropy":
                raise ValueError("objective param's objective {} not supported".format(self.objective))

        elif task_type == consts.REGRESSION:
            self.objective = self.check_and_change_lower(self.objective,
                                                               ["lse", "lae", "huber", "fair", "log_cosh", "tweedie"],
                                                               descr)

            params = self.params
            if self.objective in ["huber", "fair", "tweedie"]:
                if type(params).__name__ != 'list' or len(params) < 1:
                    raise ValueError(
                        "objective param's params {} not supported, should be non-empty list".format(params))

                if type(params[0]).__name__ not in ["float", "int", "long"]:
                    raise ValueError("objective param's params[0] {} not supported".format(self.params[0]))

                if self.objective == 'tweedie':
                    if params[0] < 1 or params[0] >= 2:
                        raise ValueError("in tweedie regression, objective params[0] should be in [1, 2)")

                if self.objective in ('fair', 'huber'):
                    if params[0] <= 0.0:
                        raise ValueError("in {} regression, objective params[0] should be greater than 0.0".format(
                            self.objective))
        return True
__init__(self, objective='cross_entropy', params=None) special
Source code in federatedml/param/boosting_param.py
def __init__(self, objective='cross_entropy', params=None):
    self.objective = objective
    self.params = params
check(self, task_type=None)
Source code in federatedml/param/boosting_param.py
def check(self, task_type=None):
    if self.objective is None:
        return True

    descr = "objective param's"

    LOGGER.debug('check objective {}'.format(self.objective))

    if task_type not in [consts.CLASSIFICATION, consts.REGRESSION]:
        self.objective = self.check_and_change_lower(self.objective,
                                               ["cross_entropy", "lse", "lae", "huber", "fair",
                                                "log_cosh", "tweedie"],
                                                   descr)

    if task_type == consts.CLASSIFICATION:
        if self.objective != "cross_entropy":
            raise ValueError("objective param's objective {} not supported".format(self.objective))

    elif task_type == consts.REGRESSION:
        self.objective = self.check_and_change_lower(self.objective,
                                                           ["lse", "lae", "huber", "fair", "log_cosh", "tweedie"],
                                                           descr)

        params = self.params
        if self.objective in ["huber", "fair", "tweedie"]:
            if type(params).__name__ != 'list' or len(params) < 1:
                raise ValueError(
                    "objective param's params {} not supported, should be non-empty list".format(params))

            if type(params[0]).__name__ not in ["float", "int", "long"]:
                raise ValueError("objective param's params[0] {} not supported".format(self.params[0]))

            if self.objective == 'tweedie':
                if params[0] < 1 or params[0] >= 2:
                    raise ValueError("in tweedie regression, objective params[0] should be in [1, 2)")

            if self.objective in ('fair', 'huber'):
                if params[0] <= 0.0:
                    raise ValueError("in {} regression, objective params[0] should be greater than 0.0".format(
                        self.objective))
    return True
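
A brief usage sketch (assuming consts.REGRESSION is the string 'regression'): for the tweedie objective, params[0] must lie in [1.0, 2.0).

```python
obj = ObjectiveParam(objective="tweedie", params=[1.5])
obj.check(task_type="regression")                                       # passes

ObjectiveParam("tweedie", params=[2.5]).check(task_type="regression")   # raises ValueError
```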
DecisionTreeParam (BaseParam)

Define decision tree parameters used in federated ML.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| criterion_method | {"xgboost"} | the criterion function to use | "xgboost" |
| criterion_params | list or dict | Should be non-empty with float elements. If a list is given, the first element is the l2 regularization value and the second is the l1 regularization value; if a dict is given, it must contain the keys 'l1' and 'l2'. The l1 and l2 regularization values are non-negative floats. | [0.1, 0] or {'l1': 0, 'l2': 0.1} |
| max_depth | positive integer | the max depth of a decision tree | 3 |
| min_sample_split | int | least quantity of samples required to split a node | 2 |
| min_impurity_split | float | least gain a single split needs to reach | 1e-3 |
| min_child_weight | float | sum of hessian needed in child nodes | 0 |
| min_leaf_node | int | when a node holds no more than min_leaf_node samples, it becomes a leaf | 1 |
| max_split_nodes | positive integer | at most max_split_nodes nodes have their splits found in parallel in one batch, for memory considerations | 65536 |
| feature_importance_type | {'split', 'gain'} | if 'split', feature importances are calculated by feature split times; if 'gain', by feature split gain | 'split' |
| use_missing | bool | whether to use missing values in the training process | False |
| zero_as_missing | bool | whether to regard 0 as a missing value; used only if use_missing=True | False |
| deterministic | bool | ensure stability when computing histograms; set to True for stable results with the same data and parameters, at some cost in speed | False |
Source code in federatedml/param/boosting_param.py
class DecisionTreeParam(BaseParam):
    """
    Define decision tree parameters that used in federated ml.

    Parameters
    ----------
    criterion_method : {"xgboost"}, default: "xgboost"
        the criterion function to use

    criterion_params: list or dict
        should be non empty and elements are float-numbers,
        if a list is offered, the first one is l2 regularization value, and the second one is
        l1 regularization value.
        if a dict is offered, make sure it contains key 'l1', and 'l2'.
        l1, l2 regularization values are non-negative floats.
        default: [0.1, 0] or {'l1':0, 'l2':0,1}

    max_depth: positive integer
        the max depth of a decision tree, default: 3

    min_sample_split: int
        least quantity of nodes to split, default: 2

    min_impurity_split: float
        least gain of a single split need to reach, default: 1e-3

    min_child_weight: float
        sum of hessian needed in child nodes. default is 0

    min_leaf_node: int
        when samples no more than min_leaf_node, it becomes a leave, default: 1

    max_split_nodes: positive integer
        we will use no more than max_split_nodes to
        parallel finding their splits in a batch, for memory consideration. default is 65536

    feature_importance_type: {'split', 'gain'}
        if is 'split', feature_importances calculate by feature split times,
        if is 'gain', feature_importances calculate by feature split gain.
        default: 'split'

    use_missing: bool
        use missing value in training process or not. default: False

    zero_as_missing: bool
        regard 0 as missing value or not,
        will be use only if use_missing=True, default: False

    deterministic: bool
        ensure stability when computing histogram. Set this to true to ensure stable result when using
        same data and same parameter. But it may slow down computation.

    """

    def __init__(self, criterion_method="xgboost", criterion_params=[0.1, 0], max_depth=3,
                 min_sample_split=2, min_impurity_split=1e-3, min_leaf_node=1,
                 max_split_nodes=consts.MAX_SPLIT_NODES, feature_importance_type="split",
                 n_iter_no_change=True, tol=0.001, min_child_weight=0,
                 use_missing=False, zero_as_missing=False, deterministic=False):

        super(DecisionTreeParam, self).__init__()

        self.criterion_method = criterion_method
        self.criterion_params = criterion_params
        self.max_depth = max_depth
        self.min_sample_split = min_sample_split
        self.min_impurity_split = min_impurity_split
        self.min_leaf_node = min_leaf_node
        self.min_child_weight = min_child_weight
        self.max_split_nodes = max_split_nodes
        self.feature_importance_type = feature_importance_type
        self.n_iter_no_change = n_iter_no_change
        self.tol = tol
        self.use_missing = use_missing
        self.zero_as_missing = zero_as_missing
        self.deterministic = deterministic

    def check(self):
        descr = "decision tree param"

        self.criterion_method = self.check_and_change_lower(self.criterion_method,
                                                             ["xgboost"],
                                                             descr)

        if len(self.criterion_params) == 0:
            raise ValueError("decisition tree param's criterio_params should be non empty")

        if type(self.criterion_params) == list:
            assert len(self.criterion_params) == 2, 'length of criterion_param should be 2: l1, l2 regularization ' \
                                                    'values are needed'
            self.check_nonnegative_number(self.criterion_params[0], 'l2 reg value')
            self.check_nonnegative_number(self.criterion_params[1], 'l1 reg value')

        elif type(self.criterion_params) == dict:
            assert 'l1' in self.criterion_params and 'l2' in self.criterion_params, 'l1 and l2 keys are needed in ' \
                                                                                    'criterion_params dict'
            self.criterion_params = [self.criterion_params['l2'], self.criterion_params['l1']]
        else:
            raise ValueError('criterion_params should be a dict or a list contains l1, l2 reg value')

        if type(self.max_depth).__name__ not in ["int", "long"]:
            raise ValueError("decision tree param's max_depth {} not supported, should be integer".format(
                self.max_depth))

        if self.max_depth < 1:
            raise ValueError("decision tree param's max_depth should be positive integer, no less than 1")

        if type(self.min_sample_split).__name__ not in ["int", "long"]:
            raise ValueError("decision tree param's min_sample_split {} not supported, should be integer".format(
                self.min_sample_split))

        if type(self.min_impurity_split).__name__ not in ["int", "long", "float"]:
            raise ValueError("decision tree param's min_impurity_split {} not supported, should be numeric".format(
                self.min_impurity_split))

        if type(self.min_leaf_node).__name__ not in ["int", "long"]:
            raise ValueError("decision tree param's min_leaf_node {} not supported, should be integer".format(
                self.min_leaf_node))

        if type(self.max_split_nodes).__name__ not in ["int", "long"] or self.max_split_nodes < 1:
            raise ValueError("decision tree param's max_split_nodes {} not supported, " + \
                             "should be positive integer between 1 and {}".format(self.max_split_nodes,
                                                                                  consts.MAX_SPLIT_NODES))

        if type(self.n_iter_no_change).__name__ != "bool":
            raise ValueError("decision tree param's n_iter_no_change {} not supported, should be bool type".format(
                self.n_iter_no_change))

        if type(self.tol).__name__ not in ["float", "int", "long"]:
            raise ValueError("decision tree param's tol {} not supported, should be numeric".format(self.tol))

        self.feature_importance_type = self.check_and_change_lower(self.feature_importance_type,
                                                                    ["split", "gain"],
                                                                    descr)

        self.check_nonnegative_number(self.min_child_weight, 'min_child_weight')
        self.check_boolean(self.deterministic, 'deterministic')

        return True
__init__(self, criterion_method='xgboost', criterion_params=[0.1, 0], max_depth=3, min_sample_split=2, min_impurity_split=0.001, min_leaf_node=1, max_split_nodes=65536, feature_importance_type='split', n_iter_no_change=True, tol=0.001, min_child_weight=0, use_missing=False, zero_as_missing=False, deterministic=False) special
Source code in federatedml/param/boosting_param.py
def __init__(self, criterion_method="xgboost", criterion_params=[0.1, 0], max_depth=3,
             min_sample_split=2, min_impurity_split=1e-3, min_leaf_node=1,
             max_split_nodes=consts.MAX_SPLIT_NODES, feature_importance_type="split",
             n_iter_no_change=True, tol=0.001, min_child_weight=0,
             use_missing=False, zero_as_missing=False, deterministic=False):

    super(DecisionTreeParam, self).__init__()

    self.criterion_method = criterion_method
    self.criterion_params = criterion_params
    self.max_depth = max_depth
    self.min_sample_split = min_sample_split
    self.min_impurity_split = min_impurity_split
    self.min_leaf_node = min_leaf_node
    self.min_child_weight = min_child_weight
    self.max_split_nodes = max_split_nodes
    self.feature_importance_type = feature_importance_type
    self.n_iter_no_change = n_iter_no_change
    self.tol = tol
    self.use_missing = use_missing
    self.zero_as_missing = zero_as_missing
    self.deterministic = deterministic
check(self)
Source code in federatedml/param/boosting_param.py
def check(self):
    descr = "decision tree param"

    self.criterion_method = self.check_and_change_lower(self.criterion_method,
                                                         ["xgboost"],
                                                         descr)

    if len(self.criterion_params) == 0:
        raise ValueError("decisition tree param's criterio_params should be non empty")

    if type(self.criterion_params) == list:
        assert len(self.criterion_params) == 2, 'length of criterion_param should be 2: l1, l2 regularization ' \
                                                'values are needed'
        self.check_nonnegative_number(self.criterion_params[0], 'l2 reg value')
        self.check_nonnegative_number(self.criterion_params[1], 'l1 reg value')

    elif type(self.criterion_params) == dict:
        assert 'l1' in self.criterion_params and 'l2' in self.criterion_params, 'l1 and l2 keys are needed in ' \
                                                                                'criterion_params dict'
        self.criterion_params = [self.criterion_params['l2'], self.criterion_params['l1']]
    else:
        raise ValueError('criterion_params should be a dict or a list contains l1, l2 reg value')

    if type(self.max_depth).__name__ not in ["int", "long"]:
        raise ValueError("decision tree param's max_depth {} not supported, should be integer".format(
            self.max_depth))

    if self.max_depth < 1:
        raise ValueError("decision tree param's max_depth should be positive integer, no less than 1")

    if type(self.min_sample_split).__name__ not in ["int", "long"]:
        raise ValueError("decision tree param's min_sample_split {} not supported, should be integer".format(
            self.min_sample_split))

    if type(self.min_impurity_split).__name__ not in ["int", "long", "float"]:
        raise ValueError("decision tree param's min_impurity_split {} not supported, should be numeric".format(
            self.min_impurity_split))

    if type(self.min_leaf_node).__name__ not in ["int", "long"]:
        raise ValueError("decision tree param's min_leaf_node {} not supported, should be integer".format(
            self.min_leaf_node))

    if type(self.max_split_nodes).__name__ not in ["int", "long"] or self.max_split_nodes < 1:
        raise ValueError("decision tree param's max_split_nodes {} not supported, " + \
                         "should be positive integer between 1 and {}".format(self.max_split_nodes,
                                                                              consts.MAX_SPLIT_NODES))

    if type(self.n_iter_no_change).__name__ != "bool":
        raise ValueError("decision tree param's n_iter_no_change {} not supported, should be bool type".format(
            self.n_iter_no_change))

    if type(self.tol).__name__ not in ["float", "int", "long"]:
        raise ValueError("decision tree param's tol {} not supported, should be numeric".format(self.tol))

    self.feature_importance_type = self.check_and_change_lower(self.feature_importance_type,
                                                                ["split", "gain"],
                                                                descr)

    self.check_nonnegative_number(self.min_child_weight, 'min_child_weight')
    self.check_boolean(self.deterministic, 'deterministic')

    return True
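
A brief sketch of the dict form: check() normalizes a {'l1', 'l2'} dict into the internal [l2, l1] list layout.

```python
dt_param = DecisionTreeParam(criterion_params={"l1": 0.0, "l2": 0.1})
dt_param.check()
print(dt_param.criterion_params)   # [0.1, 0.0]
```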
BoostingParam (BaseParam)

Basic parameter for Boosting Algorithms

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| task_type | {'classification', 'regression'} | task type | 'classification' |
| objective_param | ObjectiveParam object | objective param | ObjectiveParam() |
| learning_rate | float, int or long | the learning rate of secure boost | 0.3 |
| num_trees | int or float | the max number of boosting rounds | 5 |
| subsample_feature_rate | float | a float number in [0, 1] | 1.0 |
| n_iter_no_change | bool | when True and the residual error is less than tol, the tree building process stops | True |
| bin_num | positive integer greater than 1 | number of bins used in quantile binning | 32 |
| validation_freqs | None, positive integer or container object | whether to do validation during training: if None, no validation is done; if a positive integer, data is validated every validation_freqs epochs; if a container, data is validated whenever the epoch is in the container, e.g. validation_freqs = [10, 15] validates at epochs 10 and 15 | None |
Source code in federatedml/param/boosting_param.py
class BoostingParam(BaseParam):
    """
    Basic parameter for Boosting Algorithms

    Parameters
    ----------
    task_type : {'classification', 'regression'}, default: 'classification'
        task type

    objective_param : ObjectiveParam Object, default: ObjectiveParam()
        objective param

    learning_rate : float, int or long
        the learning rate of secure boost. default: 0.3

    num_trees : int or float
        the max number of boosting round. default: 5

    subsample_feature_rate : float
        a float-number in [0, 1], default: 1.0

    n_iter_no_change : bool,
        when True and residual error less than tol, tree building process will stop. default: True

    bin_num: positive integer greater than 1
        bin number use in quantile. default: 32

    validation_freqs: None or positive integer or container object in python
        Do validation in training process or Not.
        if equals None, will not do validation in train process;
        if equals positive integer, will validate data every validation_freqs epochs passes;
        if container object in python, will validate data if epochs belong to this container.
        e.g. validation_freqs = [10, 15], will validate data when epoch equals to 10 and 15.
        Default: None
        """

    def __init__(self,  task_type=consts.CLASSIFICATION,
                 objective_param=ObjectiveParam(),
                 learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
                 tol=0.0001, bin_num=32,
                 predict_param=PredictParam(), cv_param=CrossValidationParam(),
                 validation_freqs=None, metrics=None, random_seed=100,
                 binning_error=consts.DEFAULT_RELATIVE_ERROR):

        super(BoostingParam, self).__init__()

        self.task_type = task_type
        self.objective_param = copy.deepcopy(objective_param)
        self.learning_rate = learning_rate
        self.num_trees = num_trees
        self.subsample_feature_rate = subsample_feature_rate
        self.n_iter_no_change = n_iter_no_change
        self.tol = tol
        self.bin_num = bin_num
        self.predict_param = copy.deepcopy(predict_param)
        self.cv_param = copy.deepcopy(cv_param)
        self.validation_freqs = validation_freqs
        self.metrics = metrics
        self.random_seed = random_seed
        self.binning_error = binning_error

    def check(self):

        descr = "boosting tree param's"

        if self.task_type not in [consts.CLASSIFICATION, consts.REGRESSION]:
            raise ValueError("boosting_core tree param's task_type {} not supported, should be {} or {}".format(
                self.task_type, consts.CLASSIFICATION, consts.REGRESSION))

        self.objective_param.check(self.task_type)

        if type(self.learning_rate).__name__ not in ["float", "int", "long"]:
            raise ValueError("boosting_core tree param's learning_rate {} not supported, should be numeric".format(
                self.learning_rate))

        if type(self.subsample_feature_rate).__name__ not in ["float", "int", "long"] or \
                self.subsample_feature_rate < 0 or self.subsample_feature_rate > 1:
            raise ValueError("boosting_core tree param's subsample_feature_rate should be a numeric number between 0 and 1")

        if type(self.n_iter_no_change).__name__ != "bool":
            raise ValueError("boosting_core tree param's n_iter_no_change {} not supported, should be bool type".format(
                self.n_iter_no_change))

        if type(self.tol).__name__ not in ["float", "int", "long"]:
            raise ValueError("boosting_core tree param's tol {} not supported, should be numeric".format(self.tol))

        if type(self.bin_num).__name__ not in ["int", "long"] or self.bin_num < 2:
            raise ValueError(
                "boosting_core tree param's bin_num {} not supported, should be positive integer greater than 1".format(
                    self.bin_num))

        if self.validation_freqs is None:
            pass
        elif isinstance(self.validation_freqs, int):
            if self.validation_freqs < 1:
                raise ValueError("validation_freqs should be larger than 0 when it's integer")
        elif not isinstance(self.validation_freqs, collections.abc.Container):
            raise ValueError("validation_freqs should be None or positive integer or container")

        if self.metrics is not None and not isinstance(self.metrics, list):
            raise ValueError("metrics should be a list")

        if self.random_seed is not None:
            assert type(self.random_seed) == int and self.random_seed >= 0, 'random seed must be an integer >= 0'

        self.check_decimal_float(self.binning_error, descr)

        return True
__init__(self, task_type='classification', objective_param=ObjectiveParam(), learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True, tol=0.0001, bin_num=32, predict_param=PredictParam(), cv_param=CrossValidationParam(), validation_freqs=None, metrics=None, random_seed=100, binning_error=0.0001) special
Source code in federatedml/param/boosting_param.py
def __init__(self,  task_type=consts.CLASSIFICATION,
             objective_param=ObjectiveParam(),
             learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
             tol=0.0001, bin_num=32,
             predict_param=PredictParam(), cv_param=CrossValidationParam(),
             validation_freqs=None, metrics=None, random_seed=100,
             binning_error=consts.DEFAULT_RELATIVE_ERROR):

    super(BoostingParam, self).__init__()

    self.task_type = task_type
    self.objective_param = copy.deepcopy(objective_param)
    self.learning_rate = learning_rate
    self.num_trees = num_trees
    self.subsample_feature_rate = subsample_feature_rate
    self.n_iter_no_change = n_iter_no_change
    self.tol = tol
    self.bin_num = bin_num
    self.predict_param = copy.deepcopy(predict_param)
    self.cv_param = copy.deepcopy(cv_param)
    self.validation_freqs = validation_freqs
    self.metrics = metrics
    self.random_seed = random_seed
    self.binning_error = binning_error
check(self)
Source code in federatedml/param/boosting_param.py
def check(self):

    descr = "boosting tree param's"

    if self.task_type not in [consts.CLASSIFICATION, consts.REGRESSION]:
        raise ValueError("boosting_core tree param's task_type {} not supported, should be {} or {}".format(
            self.task_type, consts.CLASSIFICATION, consts.REGRESSION))

    self.objective_param.check(self.task_type)

    if type(self.learning_rate).__name__ not in ["float", "int", "long"]:
        raise ValueError("boosting_core tree param's learning_rate {} not supported, should be numeric".format(
            self.learning_rate))

    if type(self.subsample_feature_rate).__name__ not in ["float", "int", "long"] or \
            self.subsample_feature_rate < 0 or self.subsample_feature_rate > 1:
        raise ValueError("boosting_core tree param's subsample_feature_rate should be a numeric number between 0 and 1")

    if type(self.n_iter_no_change).__name__ != "bool":
        raise ValueError("boosting_core tree param's n_iter_no_change {} not supported, should be bool type".format(
            self.n_iter_no_change))

    if type(self.tol).__name__ not in ["float", "int", "long"]:
        raise ValueError("boosting_core tree param's tol {} not supported, should be numeric".format(self.tol))

    if type(self.bin_num).__name__ not in ["int", "long"] or self.bin_num < 2:
        raise ValueError(
            "boosting_core tree param's bin_num {} not supported, should be positive integer greater than 1".format(
                self.bin_num))

    if self.validation_freqs is None:
        pass
    elif isinstance(self.validation_freqs, int):
        if self.validation_freqs < 1:
            raise ValueError("validation_freqs should be larger than 0 when it's integer")
    elif not isinstance(self.validation_freqs, collections.Container):
        raise ValueError("validation_freqs should be None or positive integer or container")

    if self.metrics is not None and not isinstance(self.metrics, list):
        raise ValueError("metrics should be a list")

    if self.random_seed is not None:
        assert type(self.random_seed) == int and self.random_seed >= 0, 'random seed must be an integer >= 0'

    self.check_decimal_float(self.binning_error, descr)

    return True
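As a usage reference, here is a minimal sketch of constructing and validating this param class; it assumes a FATE deployment where federatedml is importable, and the parameter values are arbitrary examples.

```python
# Minimal usage sketch (assumes federatedml is importable; values are examples).
from federatedml.param.boosting_param import BoostingParam

param = BoostingParam(task_type="classification",
                      learning_rate=0.3,
                      num_trees=5,
                      bin_num=32,
                      validation_freqs=1)
param.check()  # raises ValueError on invalid settings, returns True otherwise
```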
HeteroBoostingParam (BoostingParam)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| encrypt_param | EncryptParam object | encrypt method used in secure boost | EncryptParam() |
| encrypted_mode_calculator_param | EncryptedModeCalculatorParam object | the calculation mode used in secureboost | EncryptedModeCalculatorParam() |
Source code in federatedml/param/boosting_param.py
class HeteroBoostingParam(BoostingParam):

    """
    Parameters
    ----------
    encrypt_param : EncryptParam object
        encrypt method use in secure boost, default: EncryptParam()

    encrypted_mode_calculator_param: EncryptedModeCalculatorParam object
        the calculation mode use in secureboost,
        default: EncryptedModeCalculatorParam()
    """

    def __init__(self, task_type=consts.CLASSIFICATION,
                 objective_param=ObjectiveParam(),
                 learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
                 tol=0.0001, encrypt_param=EncryptParam(),
                 bin_num=32,
                 encrypted_mode_calculator_param=EncryptedModeCalculatorParam(),
                 predict_param=PredictParam(), cv_param=CrossValidationParam(),
                 validation_freqs=None, early_stopping_rounds=None, metrics=None, use_first_metric_only=False,
                 random_seed=100, binning_error=consts.DEFAULT_RELATIVE_ERROR):

        super(HeteroBoostingParam, self).__init__(task_type, objective_param, learning_rate, num_trees,
                                                  subsample_feature_rate, n_iter_no_change, tol, bin_num,
                                                  predict_param, cv_param, validation_freqs, metrics=metrics,
                                                  random_seed=random_seed,
                                                  binning_error=binning_error)

        self.encrypt_param = copy.deepcopy(encrypt_param)
        self.encrypted_mode_calculator_param = copy.deepcopy(encrypted_mode_calculator_param)
        self.early_stopping_rounds = early_stopping_rounds
        self.use_first_metric_only = use_first_metric_only

    def check(self):

        super(HeteroBoostingParam, self).check()
        self.encrypted_mode_calculator_param.check()
        self.encrypt_param.check()

        if self.early_stopping_rounds is None:
            pass
        elif isinstance(self.early_stopping_rounds, int):
            if self.early_stopping_rounds < 1:
                raise ValueError("early stopping rounds should be larger than 0 when it's integer")
            if self.validation_freqs is None:
                raise ValueError("validation freqs must be set when early stopping is enabled")

        if not isinstance(self.use_first_metric_only, bool):
            raise ValueError("use_first_metric_only should be a boolean")

        return True
__init__(self, task_type='classification', objective_param=ObjectiveParam(), learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True, tol=0.0001, encrypt_param=EncryptParam(), bin_num=32, encrypted_mode_calculator_param=EncryptedModeCalculatorParam(), predict_param=PredictParam(), cv_param=CrossValidationParam(), validation_freqs=None, early_stopping_rounds=None, metrics=None, use_first_metric_only=False, random_seed=100, binning_error=0.0001) special
Source code in federatedml/param/boosting_param.py
def __init__(self, task_type=consts.CLASSIFICATION,
             objective_param=ObjectiveParam(),
             learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
             tol=0.0001, encrypt_param=EncryptParam(),
             bin_num=32,
             encrypted_mode_calculator_param=EncryptedModeCalculatorParam(),
             predict_param=PredictParam(), cv_param=CrossValidationParam(),
             validation_freqs=None, early_stopping_rounds=None, metrics=None, use_first_metric_only=False,
             random_seed=100, binning_error=consts.DEFAULT_RELATIVE_ERROR):

    super(HeteroBoostingParam, self).__init__(task_type, objective_param, learning_rate, num_trees,
                                              subsample_feature_rate, n_iter_no_change, tol, bin_num,
                                              predict_param, cv_param, validation_freqs, metrics=metrics,
                                              random_seed=random_seed,
                                              binning_error=binning_error)

    self.encrypt_param = copy.deepcopy(encrypt_param)
    self.encrypted_mode_calculator_param = copy.deepcopy(encrypted_mode_calculator_param)
    self.early_stopping_rounds = early_stopping_rounds
    self.use_first_metric_only = use_first_metric_only
check(self)
Source code in federatedml/param/boosting_param.py
def check(self):

    super(HeteroBoostingParam, self).check()
    self.encrypted_mode_calculator_param.check()
    self.encrypt_param.check()

    if self.early_stopping_rounds is None:
        pass
    elif isinstance(self.early_stopping_rounds, int):
        if self.early_stopping_rounds < 1:
            raise ValueError("early stopping rounds should be larger than 0 when it's integer")
        if self.validation_freqs is None:
            raise ValueError("validation freqs must be set when early stopping is enabled")

    if not isinstance(self.use_first_metric_only, bool):
        raise ValueError("use_first_metric_only should be a boolean")

    return True
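As the check above enforces, early_stopping_rounds can only be used together with validation_freqs. A hedged sketch, assuming federatedml is importable and with arbitrary example values:

```python
# Sketch: early stopping requires validation_freqs to be set.
from federatedml.param.boosting_param import HeteroBoostingParam

param = HeteroBoostingParam(num_trees=50,
                            validation_freqs=1,        # must be set for early stopping
                            early_stopping_rounds=5,   # stop after 5 rounds without improvement
                            use_first_metric_only=True)
param.check()  # would raise ValueError if validation_freqs were None
```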
HeteroSecureBoostParam (HeteroBoostingParam)

Define the boosting tree parameters used in federated ml.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| task_type | {'classification', 'regression'} | task type | 'classification' |
| tree_param | DecisionTreeParam object | tree param | DecisionTreeParam() |
| objective_param | ObjectiveParam object | objective param | ObjectiveParam() |
| learning_rate | float, int or long | the learning rate of secure boost | 0.3 |
| num_trees | int or float | the max number of trees to build | 5 |
| subsample_feature_rate | float | a float number in [0, 1] | 1.0 |
| random_seed | int | seed that controls all random functions | 100 |
| n_iter_no_change | bool | when True and the residual error is less than tol, tree building stops | True |
| encrypt_param | EncryptParam object | encrypt method used in secure boost; this parameter is only for hetero-secureboost | EncryptParam() |
| bin_num | positive integer greater than 1 | bin number used in quantile binning | 32 |
| encrypted_mode_calculator_param | EncryptedModeCalculatorParam object | the calculation mode used in secureboost; only for hetero-secureboost | EncryptedModeCalculatorParam() |
| use_missing | bool | whether to use missing values in the training process | False |
| zero_as_missing | bool | whether to regard 0 as a missing value; effective only when use_missing=True | False |
| validation_freqs | None, positive integer, or container object | If None, no validation is run during training; if a positive integer, data is validated every validation_freqs epochs; if a container, data is validated whenever the epoch is in the container, e.g. validation_freqs=[10, 15] validates at epochs 10 and 15 (see the sketch after this table). A value of 1 is suggested; a larger value speeds up training by skipping validation rounds, in which case a divisor of num_trees is recommended so the last training iteration is still validated. | None |
| early_stopping_rounds | integer larger than 0 | stop training when no metric on the validation data improves in the last early_stopping_rounds rounds; requires validation_freqs, and early stopping is checked at every validation epoch | None |
| metrics | list | metrics used for evaluation during training; if empty, task-type defaults apply: regression ['root_mean_squared_error', 'mean_absolute_error'], binary classification ['auc', 'ks'], multi-classification ['accuracy', 'precision', 'recall'] | None |
| use_first_metric_only | bool | use only the first metric for early stopping | False |
| complete_secure | bool | if set, the first tree is built using only guest features | False |
| sparse_optimization | bool | an optimized mode for high-dimension, sparse data; available when the encrypt method is 'iterativeAffine' | False |
| run_goss | bool | activate Gradient-based One-Side Sampling, which selects large- and small-gradient samples using top_rate and other_rate | False |
| top_rate | float | the retain ratio of large-gradient data, used when run_goss is True | 0.2 |
| other_rate | float | the retain ratio of small-gradient data, used when run_goss is True | 0.1 |
| cipher_compress_error | None | this param is now abandoned | None |
| cipher_compress | bool | use cipher compressing to reduce computation and transfer cost | True |
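The validation_freqs semantics described above can be paraphrased in a few lines; this is an illustrative sketch of the documented behavior, not FATE's internal scheduling code.

```python
# Illustrative paraphrase of the documented validation_freqs semantics.
def need_validate(epoch, validation_freqs):
    if validation_freqs is None:           # no validation during training
        return False
    if isinstance(validation_freqs, int):  # validate every validation_freqs epochs
        return epoch % validation_freqs == 0
    return epoch in validation_freqs       # container: validate at the listed epochs

assert need_validate(10, [10, 15]) and not need_validate(11, [10, 15])
```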
Source code in federatedml/param/boosting_param.py
class HeteroSecureBoostParam(HeteroBoostingParam):
    """
    Define boosting tree parameters that used in federated ml.

    Parameters
    ----------
    task_type : {'classification', 'regression'}, default: 'classification'
        task type

    tree_param : DecisionTreeParam Object, default: DecisionTreeParam()
        tree param

    objective_param : ObjectiveParam Object, default: ObjectiveParam()
        objective param

    learning_rate : float, int or long
        the learning rate of secure boost. default: 0.3

    num_trees : int or float
        the max number of trees to build. default: 5

    subsample_feature_rate : float
        a float-number in [0, 1], default: 1.0

    random_seed: int
        seed that controls all random functions

    n_iter_no_change : bool,
        when True and residual error less than tol, tree building process will stop. default: True

    encrypt_param : EncryptParam object
        encrypt method use in secure boost, default: EncryptParam(), this parameter
        is only for hetero-secureboost

    bin_num: positive integer greater than 1
        bin number use in quantile. default: 32

    encrypted_mode_calculator_param: EncryptedModeCalculatorParam object
        the calculation mode use in secureboost, default: EncryptedModeCalculatorParam(), only for hetero-secureboost

    use_missing: bool
        use missing value in training process or not. default: False

    zero_as_missing: bool
        regard 0 as missing value or not, will be use only if use_missing=True, default: False

    validation_freqs: None or positive integer or container object in python
        Do validation in training process or Not.
        if equals None, will not do validation in train process;
        if equals positive integer, will validate data every validation_freqs epochs passes;
        if container object in python, will validate data if epochs belong to this container.
        e.g. validation_freqs = [10, 15], will validate data when epoch equals to 10 and 15.
        Default: None
        The default value is None, 1 is suggested. You can set it to a number larger than 1 in order to
        speed up training by skipping validation rounds. When it is larger than 1, a number which is
        divisible by "num_trees" is recommended, otherwise, you will miss the validation scores
        of last training iteration.

    early_stopping_rounds: integer larger than 0
        will stop training if one metric of one validation data
        doesn’t improve in last early_stopping_round rounds,
        need to set validation freqs; early stopping is checked at every validation epoch

    metrics: list, default: []
        Specify which metrics to be used when performing evaluation during training process.
        If set as empty, default metrics will be used. For regression tasks, default metrics are
        ['root_mean_squared_error', 'mean_absolute_error'], For binary-classification tasks, default metrics
        are ['auc', 'ks']. For multi-classification tasks, default metrics are ['accuracy', 'precision', 'recall']

    use_first_metric_only: bool
        use only the first metric for early stopping

    complete_secure: bool
        if complete_secure is set, the first tree is built using only guest features

    sparse_optimization: bool
        Available when encrypted method is 'iterativeAffine'
        An optimized mode for high-dimension, sparse data.

    run_goss: bool
        activate Gradient-based One-Side Sampling, which selects large gradient and small
        gradient samples using top_rate and other_rate.

    top_rate: float
        the retain ratio of large gradient data, used when run_goss is True

    other_rate: float
        the retain ratio of small gradient data, used when run_goss is True

    cipher_compress_error: {None}
        This param is now abandoned

    cipher_compress: bool
        default is True, use cipher compressing to reduce computation cost and transfer cost

    """

    def __init__(self, tree_param: DecisionTreeParam = DecisionTreeParam(), task_type=consts.CLASSIFICATION,
                 objective_param=ObjectiveParam(),
                 learning_rate=0.3, num_trees=5, subsample_feature_rate=1.0, n_iter_no_change=True,
                 tol=0.0001, encrypt_param=EncryptParam(),
                 bin_num=32,
                 encrypted_mode_calculator_param=EncryptedModeCalculatorParam(),
                 predict_param=PredictParam(), cv_param=CrossValidationParam(),
                 validation_freqs=None, early_stopping_rounds=None, use_missing=False, zero_as_missing=False,
                 complete_secure=False, metrics=None, use_first_metric_only=False, random_seed=100,
                 binning_error=consts.DEFAULT_RELATIVE_ERROR,
                 sparse_optimization=False, run_goss=False, top_rate=0.2, other_rate=0.1,
                 cipher_compress_error=None, cipher_compress=True, new_ver=True,
                 callback_param=CallbackParam()):

        super(HeteroSecureBoostParam, self).__init__(task_type, objective_param, learning_rate, num_trees,
                                                     subsample_feature_rate, n_iter_no_change, tol, encrypt_param,
                                                     bin_num, encrypted_mode_calculator_param, predict_param, cv_param,
                                                     validation_freqs, early_stopping_rounds, metrics=metrics,
                                                     use_first_metric_only=use_first_metric_only,
                                                     random_seed=random_seed,
                                                     binning_error=binning_error)

        self.tree_param = copy.deepcopy(tree_param)
        self.zero_as_missing = zero_as_missing
        self.use_missing = use_missing
        self.complete_secure = complete_secure
        self.sparse_optimization = sparse_optimization
        self.run_goss = run_goss
        self.top_rate = top_rate
        self.other_rate = other_rate
        self.cipher_compress_error = cipher_compress_error
        self.cipher_compress = cipher_compress
        self.new_ver = new_ver
        self.callback_param = copy.deepcopy(callback_param)

    def check(self):

        super(HeteroSecureBoostParam, self).check()
        self.tree_param.check()
        if type(self.use_missing) != bool:
            raise ValueError('use missing should be bool type')
        if type(self.zero_as_missing) != bool:
            raise ValueError('zero as missing should be bool type')
        self.check_boolean(self.complete_secure, 'complete_secure')
        self.check_boolean(self.sparse_optimization, 'sparse optimization')
        self.check_boolean(self.run_goss, 'run goss')
        self.check_decimal_float(self.top_rate, 'top rate')
        self.check_decimal_float(self.other_rate, 'other rate')
        self.check_positive_number(self.other_rate, 'other_rate')
        self.check_positive_number(self.top_rate, 'top_rate')
        self.check_boolean(self.new_ver, 'code version switcher')
        self.check_boolean(self.cipher_compress, 'cipher compress')

        for p in ["early_stopping_rounds", "validation_freqs", "metrics",
                  "use_first_metric_only"]:
            # if self._warn_to_deprecate_param(p, "", ""):
            if self._deprecated_params_set.get(p):
                if "callback_param" in self.get_user_feeded():
                    raise ValueError(f"{p} and callback param should not be set simultaneously,"
                                     f"{self._deprecated_params_set}, {self.get_user_feeded()}")
                else:
                    self.callback_param.callbacks = ["PerformanceEvaluate"]
                break

        descr = "boosting_param's"

        if self._warn_to_deprecate_param("validation_freqs", descr, "callback_param's 'validation_freqs'"):
            self.callback_param.validation_freqs = self.validation_freqs

        if self._warn_to_deprecate_param("early_stopping_rounds", descr, "callback_param's 'early_stopping_rounds'"):
            self.callback_param.early_stopping_rounds = self.early_stopping_rounds

        if self._warn_to_deprecate_param("metrics", descr, "callback_param's 'metrics'"):
            self.callback_param.metrics = self.metrics

        if self._warn_to_deprecate_param("use_first_metric_only", descr, "callback_param's 'use_first_metric_only'"):
            self.callback_param.use_first_metric_only = self.use_first_metric_only

        if self.top_rate + self.other_rate >= 1:
            raise ValueError('sum of top rate and other rate should be smaller than 1')

        if self.sparse_optimization and self.cipher_compress:
            raise ValueError('cipher compress is not supported in sparse optimization mode')

        return True
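A usage sketch for this class follows; it assumes federatedml is importable, and the values are arbitrary examples rather than recommended settings.

```python
# Usage sketch for HeteroSecureBoostParam (values are arbitrary examples).
from federatedml.param.boosting_param import HeteroSecureBoostParam

param = HeteroSecureBoostParam(num_trees=10,
                               learning_rate=0.3,
                               subsample_feature_rate=1.0,
                               use_missing=True,       # handle missing values during training
                               zero_as_missing=False,
                               cipher_compress=True)
param.check()  # also validates nested tree_param, encrypt_param, etc.
```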
__init__(self, tree_param=DecisionTreeParam(), task_type='classification', objective_param=ObjectiveParam(), learning_rate=0.3, num_trees=5, subsample_feature_rate=1.0, n_iter_no_change=True, tol=0.0001, encrypt_param=EncryptParam(), bin_num=32, encrypted_mode_calculator_param=EncryptedModeCalculatorParam(), predict_param=PredictParam(), cv_param=CrossValidationParam(), validation_freqs=None, early_stopping_rounds=None, use_missing=False, zero_as_missing=False, complete_secure=False, metrics=None, use_first_metric_only=False, random_seed=100, binning_error=0.0001, sparse_optimization=False, run_goss=False, top_rate=0.2, other_rate=0.1, cipher_compress_error=None, cipher_compress=True, new_ver=True, callback_param=CallbackParam()) special
Source code in federatedml/param/boosting_param.py
def __init__(self, tree_param: DecisionTreeParam = DecisionTreeParam(), task_type=consts.CLASSIFICATION,
             objective_param=ObjectiveParam(),
             learning_rate=0.3, num_trees=5, subsample_feature_rate=1.0, n_iter_no_change=True,
             tol=0.0001, encrypt_param=EncryptParam(),
             bin_num=32,
             encrypted_mode_calculator_param=EncryptedModeCalculatorParam(),
             predict_param=PredictParam(), cv_param=CrossValidationParam(),
             validation_freqs=None, early_stopping_rounds=None, use_missing=False, zero_as_missing=False,
             complete_secure=False, metrics=None, use_first_metric_only=False, random_seed=100,
             binning_error=consts.DEFAULT_RELATIVE_ERROR,
             sparse_optimization=False, run_goss=False, top_rate=0.2, other_rate=0.1,
             cipher_compress_error=None, cipher_compress=True, new_ver=True,
             callback_param=CallbackParam()):

    super(HeteroSecureBoostParam, self).__init__(task_type, objective_param, learning_rate, num_trees,
                                                 subsample_feature_rate, n_iter_no_change, tol, encrypt_param,
                                                 bin_num, encrypted_mode_calculator_param, predict_param, cv_param,
                                                 validation_freqs, early_stopping_rounds, metrics=metrics,
                                                 use_first_metric_only=use_first_metric_only,
                                                 random_seed=random_seed,
                                                 binning_error=binning_error)

    self.tree_param = copy.deepcopy(tree_param)
    self.zero_as_missing = zero_as_missing
    self.use_missing = use_missing
    self.complete_secure = complete_secure
    self.sparse_optimization = sparse_optimization
    self.run_goss = run_goss
    self.top_rate = top_rate
    self.other_rate = other_rate
    self.cipher_compress_error = cipher_compress_error
    self.cipher_compress = cipher_compress
    self.new_ver = new_ver
    self.callback_param = copy.deepcopy(callback_param)
check(self)
Source code in federatedml/param/boosting_param.py
def check(self):

    super(HeteroSecureBoostParam, self).check()
    self.tree_param.check()
    if type(self.use_missing) != bool:
        raise ValueError('use missing should be bool type')
    if type(self.zero_as_missing) != bool:
        raise ValueError('zero as missing should be bool type')
    self.check_boolean(self.complete_secure, 'complete_secure')
    self.check_boolean(self.sparse_optimization, 'sparse optimization')
    self.check_boolean(self.run_goss, 'run goss')
    self.check_decimal_float(self.top_rate, 'top rate')
    self.check_decimal_float(self.other_rate, 'other rate')
    self.check_positive_number(self.other_rate, 'other_rate')
    self.check_positive_number(self.top_rate, 'top_rate')
    self.check_boolean(self.new_ver, 'code version switcher')
    self.check_boolean(self.cipher_compress, 'cipher compress')

    for p in ["early_stopping_rounds", "validation_freqs", "metrics",
              "use_first_metric_only"]:
        # if self._warn_to_deprecate_param(p, "", ""):
        if self._deprecated_params_set.get(p):
            if "callback_param" in self.get_user_feeded():
                raise ValueError(f"{p} and callback param should not be set simultaneously,"
                                 f"{self._deprecated_params_set}, {self.get_user_feeded()}")
            else:
                self.callback_param.callbacks = ["PerformanceEvaluate"]
            break

    descr = "boosting_param's"

    if self._warn_to_deprecate_param("validation_freqs", descr, "callback_param's 'validation_freqs'"):
        self.callback_param.validation_freqs = self.validation_freqs

    if self._warn_to_deprecate_param("early_stopping_rounds", descr, "callback_param's 'early_stopping_rounds'"):
        self.callback_param.early_stopping_rounds = self.early_stopping_rounds

    if self._warn_to_deprecate_param("metrics", descr, "callback_param's 'metrics'"):
        self.callback_param.metrics = self.metrics

    if self._warn_to_deprecate_param("use_first_metric_only", descr, "callback_param's 'use_first_metric_only'"):
        self.callback_param.use_first_metric_only = self.use_first_metric_only

    if self.top_rate + self.other_rate >= 1:
        raise ValueError('sum of top rate and other rate should be smaller than 1')

    if self.sparse_optimization and self.cipher_compress:
        raise ValueError('cipher compress is not supported in sparse optimization mode')

    return True
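The top_rate/other_rate constraint enforced above (their sum must stay below 1) matches standard GOSS: keep the top_rate fraction of samples with the largest gradient magnitudes and randomly sample an other_rate fraction of the rest. The sketch below illustrates that selection rule only; it is not FATE's implementation.

```python
# Illustrative GOSS row selection under top_rate/other_rate (not FATE's code).
import random

def goss_select(gradients, top_rate=0.2, other_rate=0.1):
    n = len(gradients)
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    n_top = int(n * top_rate)
    large = order[:n_top]                                      # keep all large-gradient rows
    small = random.sample(order[n_top:], int(n * other_rate))  # subsample the rest
    return large + small

rows = goss_select([0.9, -0.1, 0.05, 0.4, -0.7, 0.02, 0.3, -0.2, 0.6, 0.01])
```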
HeteroFastSecureBoostParam (HeteroSecureBoostParam)
Source code in federatedml/param/boosting_param.py
class HeteroFastSecureBoostParam(HeteroSecureBoostParam):

    def __init__(self, tree_param: DecisionTreeParam = DecisionTreeParam(), task_type=consts.CLASSIFICATION,
                 objective_param=ObjectiveParam(),
                 learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
                 tol=0.0001, encrypt_param=EncryptParam(),
                 bin_num=32,
                 encrypted_mode_calculator_param=EncryptedModeCalculatorParam(),
                 predict_param=PredictParam(), cv_param=CrossValidationParam(),
                 validation_freqs=None, early_stopping_rounds=None, use_missing=False, zero_as_missing=False,
                 complete_secure=False, tree_num_per_party=1, guest_depth=1, host_depth=1, work_mode='mix', metrics=None,
                 sparse_optimization=False, random_seed=100, binning_error=consts.DEFAULT_RELATIVE_ERROR,
                 cipher_compress_error=None, new_ver=True, run_goss=False, top_rate=0.2, other_rate=0.1,
                 cipher_compress=True, callback_param=CallbackParam()):

        """
        Parameters
        ----------
        work_mode: {"mix", "layered"}
            mix:  alternate using guest/host features to build trees. For example, the first 'tree_num_per_party' trees use guest features,
                  the second k trees use host features, and so on
            layered: only support 2 party, when running layered mode, first 'host_depth' layer will use host features,
                     and then next 'guest_depth' will only use guest features
        tree_num_per_party: int
            every party will alternate build 'tree_num_per_party' trees until reach max tree num, this param is valid when work_mode is mix
        guest_depth: int
            guest will build last guest_depth of a decision tree using guest features, is valid when work mode is layered
        host_depth: int
            host will build first host_depth of a decision tree using host features, is valid when work mode is layered

        """

        super(HeteroFastSecureBoostParam, self).__init__(tree_param, task_type, objective_param, learning_rate,
                                                         num_trees, subsample_feature_rate, n_iter_no_change, tol,
                                                         encrypt_param, bin_num, encrypted_mode_calculator_param,
                                                         predict_param, cv_param, validation_freqs, early_stopping_rounds,
                                                         use_missing, zero_as_missing, complete_secure, metrics=metrics,
                                                         random_seed=random_seed,
                                                         sparse_optimization=sparse_optimization,
                                                         binning_error=binning_error,
                                                         cipher_compress_error=cipher_compress_error,
                                                         new_ver=new_ver,
                                                         cipher_compress=cipher_compress,
                                                         run_goss=run_goss, top_rate=top_rate, other_rate=other_rate,
                                                         )

        self.tree_num_per_party = tree_num_per_party
        self.guest_depth = guest_depth
        self.host_depth = host_depth
        self.work_mode = work_mode
        self.callback_param = copy.deepcopy(callback_param)

    def check(self):

        super(HeteroFastSecureBoostParam, self).check()
        if type(self.guest_depth).__name__ not in ["int", "long"] or self.guest_depth <= 0:
            raise ValueError("guest_depth should be larger than 0")
        if type(self.host_depth).__name__ not in ["int", "long"] or self.host_depth <= 0:
            raise ValueError("host_depth should be larger than 0")
        if type(self.tree_num_per_party).__name__ not in ["int", "long"] or self.tree_num_per_party <= 0:
            raise ValueError("tree_num_per_party should be larger than 0")

        work_modes = [consts.MIX_TREE, consts.LAYERED_TREE]
        if self.work_mode not in work_modes:
            raise ValueError('only work_modes: {} are supported, input work mode is {}'.
                             format(work_modes, self.work_mode))

        return True
Methods
__init__(self, tree_param=DecisionTreeParam(), task_type='classification', objective_param=ObjectiveParam(), learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True, tol=0.0001, encrypt_param=EncryptParam(), bin_num=32, encrypted_mode_calculator_param=EncryptedModeCalculatorParam(), predict_param=PredictParam(), cv_param=CrossValidationParam(), validation_freqs=None, early_stopping_rounds=None, use_missing=False, zero_as_missing=False, complete_secure=False, tree_num_per_party=1, guest_depth=1, host_depth=1, work_mode='mix', metrics=None, sparse_optimization=False, random_seed=100, binning_error=0.0001, cipher_compress_error=None, new_ver=True, run_goss=False, top_rate=0.2, other_rate=0.1, cipher_compress=True, callback_param=CallbackParam()) special

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| work_mode | {'mix', 'layered'} | mix: parties alternate using guest/host features to build trees, e.g. the first tree_num_per_party trees use guest features, the next tree_num_per_party use host features, and so on; layered: supports only 2 parties, where the first host_depth layers use host features and the next guest_depth layers use guest features | 'mix' |
| tree_num_per_party | int | every party alternately builds tree_num_per_party trees until the max tree number is reached; valid when work_mode is mix | 1 |
| guest_depth | int | guest builds the last guest_depth layers of a decision tree using guest features; valid when work_mode is layered | 1 |
| host_depth | int | host builds the first host_depth layers of a decision tree using host features; valid when work_mode is layered | 1 |
Source code in federatedml/param/boosting_param.py
def __init__(self, tree_param: DecisionTreeParam = DecisionTreeParam(), task_type=consts.CLASSIFICATION,
             objective_param=ObjectiveParam(),
             learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
             tol=0.0001, encrypt_param=EncryptParam(),
             bin_num=32,
             encrypted_mode_calculator_param=EncryptedModeCalculatorParam(),
             predict_param=PredictParam(), cv_param=CrossValidationParam(),
             validation_freqs=None, early_stopping_rounds=None, use_missing=False, zero_as_missing=False,
             complete_secure=False, tree_num_per_party=1, guest_depth=1, host_depth=1, work_mode='mix', metrics=None,
             sparse_optimization=False, random_seed=100, binning_error=consts.DEFAULT_RELATIVE_ERROR,
             cipher_compress_error=None, new_ver=True, run_goss=False, top_rate=0.2, other_rate=0.1,
             cipher_compress=True, callback_param=CallbackParam()):

    """
    Parameters
    ----------
    work_mode: {"mix", "layered"}
        mix:  alternate using guest/host features to build trees. For example, the first 'tree_num_per_party' trees use guest features,
              the second k trees use host features, and so on
        layered: only support 2 party, when running layered mode, first 'host_depth' layer will use host features,
                 and then next 'guest_depth' will only use guest features
    tree_num_per_party: int
        every party will alternate build 'tree_num_per_party' trees until reach max tree num, this param is valid when work_mode is mix
    guest_depth: int
        guest will build last guest_depth of a decision tree using guest features, is valid when work mode is layered
    host_depth: int
        host will build first host_depth of a decision tree using host features, is valid when work mode is layered

    """

    super(HeteroFastSecureBoostParam, self).__init__(tree_param, task_type, objective_param, learning_rate,
                                                     num_trees, subsample_feature_rate, n_iter_no_change, tol,
                                                     encrypt_param, bin_num, encrypted_mode_calculator_param,
                                                     predict_param, cv_param, validation_freqs, early_stopping_rounds,
                                                     use_missing, zero_as_missing, complete_secure, metrics=metrics,
                                                     random_seed=random_seed,
                                                     sparse_optimization=sparse_optimization,
                                                     binning_error=binning_error,
                                                     cipher_compress_error=cipher_compress_error,
                                                     new_ver=new_ver,
                                                     cipher_compress=cipher_compress,
                                                     run_goss=run_goss, top_rate=top_rate, other_rate=other_rate,
                                                     )

    self.tree_num_per_party = tree_num_per_party
    self.guest_depth = guest_depth
    self.host_depth = host_depth
    self.work_mode = work_mode
    self.callback_param = copy.deepcopy(callback_param)
check(self)
Source code in federatedml/param/boosting_param.py
def check(self):

    super(HeteroFastSecureBoostParam, self).check()
    if type(self.guest_depth).__name__ not in ["int", "long"] or self.guest_depth <= 0:
        raise ValueError("guest_depth should be larger than 0")
    if type(self.host_depth).__name__ not in ["int", "long"] or self.host_depth <= 0:
        raise ValueError("host_depth should be larger than 0")
    if type(self.tree_num_per_party).__name__ not in ["int", "long"] or self.tree_num_per_party <= 0:
        raise ValueError("tree_num_per_party should be larger than 0")

    work_modes = [consts.MIX_TREE, consts.LAYERED_TREE]
    if self.work_mode not in work_modes:
        raise ValueError('only work_modes: {} are supported, input work mode is {}'.
                         format(work_modes, self.work_mode))

    return True
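A sketch contrasting the two work modes; it assumes federatedml is importable and that 'mix'/'layered' are the accepted work_mode strings, as the signature default and docstring indicate.

```python
# Work-mode sketch for HeteroFastSecureBoostParam (illustrative values).
from federatedml.param.boosting_param import HeteroFastSecureBoostParam

mix_param = HeteroFastSecureBoostParam(work_mode='mix',
                                       tree_num_per_party=2)  # each party builds 2 trees in turn
mix_param.check()

layered_param = HeteroFastSecureBoostParam(work_mode='layered',
                                           host_depth=2,   # first 2 layers use host features
                                           guest_depth=3)  # next 3 layers use guest features
layered_param.check()
```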
HomoSecureBoostParam (BoostingParam)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| backend | {'distributed', 'memory'} | decides which backend to use when computing histograms for homo-sbt | 'distributed' |
Source code in federatedml/param/boosting_param.py
class HomoSecureBoostParam(BoostingParam):

    """
    Parameters
    ----------
    backend: {'distributed', 'memory'}
        decides which backend to use when computing histograms for homo-sbt
    """

    def __init__(self, tree_param: DecisionTreeParam = DecisionTreeParam(), task_type=consts.CLASSIFICATION,
                 objective_param=ObjectiveParam(),
                 learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
                 tol=0.0001, bin_num=32, predict_param=PredictParam(), cv_param=CrossValidationParam(),
                 validation_freqs=None, use_missing=False, zero_as_missing=False, random_seed=100,
                 binning_error=consts.DEFAULT_RELATIVE_ERROR, backend=consts.DISTRIBUTED_BACKEND,
                 callback_param=CallbackParam()):
        super(HomoSecureBoostParam, self).__init__(task_type=task_type,
                                                   objective_param=objective_param,
                                                   learning_rate=learning_rate,
                                                   num_trees=num_trees,
                                                   subsample_feature_rate=subsample_feature_rate,
                                                   n_iter_no_change=n_iter_no_change,
                                                   tol=tol,
                                                   bin_num=bin_num,
                                                   predict_param=predict_param,
                                                   cv_param=cv_param,
                                                   validation_freqs=validation_freqs,
                                                   random_seed=random_seed,
                                                   binning_error=binning_error
                                                   )
        self.use_missing = use_missing
        self.zero_as_missing = zero_as_missing
        self.tree_param = copy.deepcopy(tree_param)
        self.backend = backend
        self.callback_param = copy.deepcopy(callback_param)

    def check(self):

        super(HomoSecureBoostParam, self).check()
        self.tree_param.check()
        if type(self.use_missing) != bool:
            raise ValueError('use missing should be bool type')
        if type(self.zero_as_missing) != bool:
            raise ValueError('zero as missing should be bool type')
        if self.backend not in [consts.MEMORY_BACKEND, consts.DISTRIBUTED_BACKEND]:
            raise ValueError('unsupported backend')

        for p in ["validation_freqs", "metrics"]:
            # if self._warn_to_deprecate_param(p, "", ""):
            if self._deprecated_params_set.get(p):
                if "callback_param" in self.get_user_feeded():
                    raise ValueError(f"{p} and callback param should not be set simultaneously,"
                                     f"{self._deprecated_params_set}, {self.get_user_feeded()}")
                else:
                    self.callback_param.callbacks = ["PerformanceEvaluate"]
                break

        descr = "boosting_param's"

        if self._warn_to_deprecate_param("validation_freqs", descr, "callback_param's 'validation_freqs'"):
            self.callback_param.validation_freqs = self.validation_freqs

        if self._warn_to_deprecate_param("metrics", descr, "callback_param's 'metrics'"):
            self.callback_param.metrics = self.metrics

        return True
__init__(self, tree_param=DecisionTreeParam(), task_type='classification', objective_param=ObjectiveParam(), learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True, tol=0.0001, bin_num=32, predict_param=PredictParam(), cv_param=CrossValidationParam(), validation_freqs=None, use_missing=False, zero_as_missing=False, random_seed=100, binning_error=0.0001, backend='distributed', callback_param=CallbackParam()) special
Source code in federatedml/param/boosting_param.py
def __init__(self, tree_param: DecisionTreeParam = DecisionTreeParam(), task_type=consts.CLASSIFICATION,
             objective_param=ObjectiveParam(),
             learning_rate=0.3, num_trees=5, subsample_feature_rate=1, n_iter_no_change=True,
             tol=0.0001, bin_num=32, predict_param=PredictParam(), cv_param=CrossValidationParam(),
             validation_freqs=None, use_missing=False, zero_as_missing=False, random_seed=100,
             binning_error=consts.DEFAULT_RELATIVE_ERROR, backend=consts.DISTRIBUTED_BACKEND,
             callback_param=CallbackParam()):
    super(HomoSecureBoostParam, self).__init__(task_type=task_type,
                                               objective_param=objective_param,
                                               learning_rate=learning_rate,
                                               num_trees=num_trees,
                                               subsample_feature_rate=subsample_feature_rate,
                                               n_iter_no_change=n_iter_no_change,
                                               tol=tol,
                                               bin_num=bin_num,
                                               predict_param=predict_param,
                                               cv_param=cv_param,
                                               validation_freqs=validation_freqs,
                                               random_seed=random_seed,
                                               binning_error=binning_error
                                               )
    self.use_missing = use_missing
    self.zero_as_missing = zero_as_missing
    self.tree_param = copy.deepcopy(tree_param)
    self.backend = backend
    self.callback_param = copy.deepcopy(callback_param)
check(self)
Source code in federatedml/param/boosting_param.py
def check(self):

    super(HomoSecureBoostParam, self).check()
    self.tree_param.check()
    if type(self.use_missing) != bool:
        raise ValueError('use missing should be bool type')
    if type(self.zero_as_missing) != bool:
        raise ValueError('zero as missing should be bool type')
    if self.backend not in [consts.MEMORY_BACKEND, consts.DISTRIBUTED_BACKEND]:
        raise ValueError('unsupported backend')

    for p in ["validation_freqs", "metrics"]:
        # if self._warn_to_deprecate_param(p, "", ""):
        if self._deprecated_params_set.get(p):
            if "callback_param" in self.get_user_feeded():
                raise ValueError(f"{p} and callback param should not be set simultaneously,"
                                 f"{self._deprecated_params_set}, {self.get_user_feeded()}")
            else:
                self.callback_param.callbacks = ["PerformanceEvaluate"]
            break

    descr = "boosting_param's"

    if self._warn_to_deprecate_param("validation_freqs", descr, "callback_param's 'validation_freqs'"):
        self.callback_param.validation_freqs = self.validation_freqs

    if self._warn_to_deprecate_param("metrics", descr, "callback_param's 'metrics'"):
        self.callback_param.metrics = self.metrics

    return True
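A short sketch of backend selection, assuming federatedml is importable and 'memory'/'distributed' are the accepted backend strings as documented above.

```python
# Backend selection sketch for HomoSecureBoostParam.
from federatedml.param.boosting_param import HomoSecureBoostParam

param = HomoSecureBoostParam(backend='memory')  # or 'distributed' (the default)
param.check()  # raises ValueError for any other backend string
```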
callback_param
Classes
CallbackParam (BaseParam)

Define the callback methods used in federated ml.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| callbacks | list | which callback functions are desired during the training process; accepted values: {'EarlyStopping', 'ModelCheckpoint', 'PerformanceEvaluate'} | None |
| validation_freqs | {None, int, list, tuple, set} | validation frequency during training | None |
| early_stopping_rounds | None or int | stop training if no metric improves in the last early_stopping_rounds rounds | None |
| metrics | None or list | metrics used when executing evaluation during training; if empty, default metrics for the task type are used, e.g. ['auc', 'ks'] for binary classification | None |
| use_first_metric_only | bool | whether to use only the first metric for the early stopping judgement | False |
| save_freq | int | the callbacks save the model every save_freq epochs | 1 |
Source code in federatedml/param/callback_param.py
class CallbackParam(BaseParam):
    """
    Define callback method that used in federated ml.

    Parameters
    ----------
    callbacks : list, default: []
        Indicate what kinds of callback functions is desired during the training process.
        Accepted values: {'EarlyStopping', 'ModelCheckpoint', 'PerformanceEvaluate'}

    validation_freqs: {None, int, list, tuple, set}
        validation frequency during training.

    early_stopping_rounds: None or int
        Will stop training if one metric doesn’t improve in last early_stopping_round rounds

    metrics: None, or list
        Indicate when executing evaluation during train process, which metrics will be used. If set as empty,
        default metrics for specific task type will be used. As for binary classification, default metrics are
        ['auc', 'ks']

    use_first_metric_only: bool, default: False
        Indicate whether use the first metric only for early stopping judgement.

    save_freq: int, default: 1
        The callbacks save model every save_freq epoch


    """

    def __init__(self, callbacks=None, validation_freqs=None, early_stopping_rounds=None,
                 metrics=None, use_first_metric_only=False, save_freq=1):
        super(CallbackParam, self).__init__()
        self.callbacks = callbacks or []
        self.validation_freqs = validation_freqs
        self.early_stopping_rounds = early_stopping_rounds
        self.metrics = metrics or []
        self.use_first_metric_only = use_first_metric_only
        self.save_freq = save_freq

    def check(self):

        if self.early_stopping_rounds is None:
            pass
        elif isinstance(self.early_stopping_rounds, int):
            if self.early_stopping_rounds < 1:
                raise ValueError("early stopping rounds should be larger than 0 when it's integer")
            if self.validation_freqs is None:
                raise ValueError("validation freqs must be set when early stopping is enabled")

        if self.metrics is not None and not isinstance(self.metrics, list):
            raise ValueError("metrics should be a list")

        if not isinstance(self.use_first_metric_only, bool):
            raise ValueError("use_first_metric_only should be a boolean")

        return True
__init__(self, callbacks=None, validation_freqs=None, early_stopping_rounds=None, metrics=None, use_first_metric_only=False, save_freq=1) special
Source code in federatedml/param/callback_param.py
def __init__(self, callbacks=None, validation_freqs=None, early_stopping_rounds=None,
             metrics=None, use_first_metric_only=False, save_freq=1):
    super(CallbackParam, self).__init__()
    self.callbacks = callbacks or []
    self.validation_freqs = validation_freqs
    self.early_stopping_rounds = early_stopping_rounds
    self.metrics = metrics or []
    self.use_first_metric_only = use_first_metric_only
    self.save_freq = save_freq
check(self)
Source code in federatedml/param/callback_param.py
def check(self):

    if self.early_stopping_rounds is None:
        pass
    elif isinstance(self.early_stopping_rounds, int):
        if self.early_stopping_rounds < 1:
            raise ValueError("early stopping rounds should be larger than 0 when it's integer")
        if self.validation_freqs is None:
            raise ValueError("validation freqs must be set when early stopping is enabled")

    if self.metrics is not None and not isinstance(self.metrics, list):
        raise ValueError("metrics should be a list")

    if not isinstance(self.use_first_metric_only, bool):
        raise ValueError("use_first_metric_only should be a boolean")

    return True
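A usage sketch combining the callback fields; note that, as check() enforces, early_stopping_rounds requires validation_freqs. Values are illustrative.

```python
# Usage sketch for CallbackParam (assumes federatedml is importable).
from federatedml.param.callback_param import CallbackParam

cb = CallbackParam(callbacks=['EarlyStopping', 'PerformanceEvaluate'],
                   validation_freqs=1,      # required when early stopping is enabled
                   early_stopping_rounds=3,
                   metrics=['auc', 'ks'],
                   save_freq=1)
cb.check()
```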
column_expand_param
Classes
ColumnExpandParam (BaseParam)

Define the method used for expanding columns.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| append_header | None, str, or List[str] | name(s) for appended feature(s); if None, the module outputs the original input value without any operation | None |
| method | str | if 'manual', fill in new features with the user-specified fill_value | 'manual' |
| fill_value | int, float, str, or a list of these | used for filling expanded feature columns; if given a list, its length must match that of append_header | 1e-08 |
| need_run | bool | indicate whether this module needs to be run | True |
Source code in federatedml/param/column_expand_param.py
class ColumnExpandParam(BaseParam):
    """
    Define method used for expanding column

    Parameters
    ----------

    append_header : None or str or List[str], default: None
        Name(s) for appended feature(s). If None is given, module outputs the original input value without any operation.

    method : str, default: 'manual'
        If method is 'manual', use user-specified `fill_value` to fill in new features.

    fill_value : int or float or str or List[int] or List[float] or List[str], default: 1e-8
        Used for filling expanded feature columns. If given a list, length of the list must match that of `append_header`

    need_run: bool, default: True
        Indicate if this module needed to be run.

    """

    def __init__(self, append_header=None, method="manual",
                 fill_value=consts.FLOAT_ZERO, need_run=True):
        super(ColumnExpandParam, self).__init__()
        self.append_header = [] if append_header is None else append_header
        self.method = method
        self.fill_value = fill_value
        self.need_run = need_run

    def check(self):
        descr = "column_expand param's "
        if not isinstance(self.method, str):
            raise ValueError(f"{descr}method {self.method} not supported, should be str type")
        else:
            user_input = self.method.lower()
            if user_input == "manual":
                self.method = consts.MANUAL
            else:
                raise ValueError(f"{descr} method {user_input} not supported")

        BaseParam.check_boolean(self.need_run, descr=descr)

        if not isinstance(self.append_header, list):
            raise ValueError(f"{descr} append_header must be None or list of str. "
                             f"Received {type(self.append_header)} instead.")
        for feature_name in self.append_header:
            BaseParam.check_string(feature_name, descr+"append_header values")

        if isinstance(self.fill_value, list):
            if len(self.append_header) != len(self.fill_value):
                raise ValueError(
                    f"{descr} `fill value` is set to be list, "
                    f"and param `append_header` must also be list of the same length.")
        else:
            self.fill_value = [self.fill_value]
        for value in self.fill_value:
            if type(value).__name__ not in ["float", "int", "long", "str"]:
                raise ValueError(
                    f"{descr} fill value(s) must be float, int, or str. Received type {type(value)} instead.")

        LOGGER.debug("Finish column expand parameter check!")
        return True
__init__(self, append_header=None, method='manual', fill_value=1e-08, need_run=True) special
Source code in federatedml/param/column_expand_param.py
def __init__(self, append_header=None, method="manual",
             fill_value=consts.FLOAT_ZERO, need_run=True):
    super(ColumnExpandParam, self).__init__()
    self.append_header = [] if append_header is None else append_header
    self.method = method
    self.fill_value = fill_value
    self.need_run = need_run
check(self)
Source code in federatedml/param/column_expand_param.py
def check(self):
    descr = "column_expand param's "
    if not isinstance(self.method, str):
        raise ValueError(f"{descr}method {self.method} not supported, should be str type")
    else:
        user_input = self.method.lower()
        if user_input == "manual":
            self.method = consts.MANUAL
        else:
            raise ValueError(f"{descr} method {user_input} not supported")

    BaseParam.check_boolean(self.need_run, descr=descr)

    if not isinstance(self.append_header, list):
        raise ValueError(f"{descr} append_header must be None or list of str. "
                         f"Received {type(self.append_header)} instead.")
    for feature_name in self.append_header:
        BaseParam.check_string(feature_name, descr+"append_header values")

    if isinstance(self.fill_value, list):
        if len(self.append_header) != len(self.fill_value):
            raise ValueError(
                f"{descr} `fill value` is set to be list, "
                f"and param `append_header` must also be list of the same length.")
    else:
        self.fill_value = [self.fill_value]
    for value in self.fill_value:
        if type(value).__name__ not in ["float", "int", "long", "str"]:
            raise ValueError(
                f"{descr} fill value(s) must be float, int, or str. Received type {type(value)} instead.")

    LOGGER.debug("Finish column expand parameter check!")
    return True
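A usage sketch; note that when fill_value is a list, its length must match append_header, as check() enforces. Values are illustrative.

```python
# Usage sketch for ColumnExpandParam (assumes federatedml is importable).
from federatedml.param.column_expand_param import ColumnExpandParam

param = ColumnExpandParam(append_header=['bias', 'flag'],
                          method='manual',
                          fill_value=[1.0, 0])  # one fill value per appended column
param.check()
```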
cross_validation_param
Classes
CrossValidationParam (BaseParam)

Define cross validation params

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| n_splits | int | how many splits are used in KFold | 5 |
| mode | str | the mode of the current task | 'hetero' |
| role | {'guest', 'host', 'arbiter'} | the role of the current party | 'guest' |
| shuffle | bool | whether to shuffle before KFold | True |
| random_seed | int | the random seed for numpy shuffle | 1 |
| need_cv | bool | indicate whether this module needs to be run | False |
| output_fold_history | bool | whether to output a table of the ids used by each fold instead of the original input data; returned ids are formatted as {original_id}#fold{fold_num}#{train/validate} | True |
| history_value_type | {'score', 'instance'} | whether to include the original instance or the predict score in the output fold history; only effective when output_fold_history is True | 'score' |
Source code in federatedml/param/cross_validation_param.py
class CrossValidationParam(BaseParam):
    """
    Define cross validation params

    Parameters
    ----------
    n_splits: int, default: 5
        Specify how many splits used in KFold

    mode: str, default: 'Hetero'
        Indicate what mode is current task

    role: {'Guest', 'Host', 'Arbiter'}, default: 'Guest'
        Indicate what role is current party

    shuffle: bool, default: True
        Define whether do shuffle before KFold or not.

    random_seed: int, default: 1
        Specify the random seed for numpy shuffle

    need_cv: bool, default False
        Indicate if this module needed to be run

    output_fold_history: bool, default True
        Indicate whether to output table of ids used by each fold, else return original input data
        returned ids are formatted as: {original_id}#fold{fold_num}#{train/validate}

    history_value_type: {'score', 'instance'}, default score
        Indicate whether to include original instance or predict score in the output fold history,
        only effective when output_fold_history set to True

    """

    def __init__(self, n_splits=5, mode=consts.HETERO, role=consts.GUEST, shuffle=True, random_seed=1,
                 need_cv=False, output_fold_history=True, history_value_type="score"):
        super(CrossValidationParam, self).__init__()
        self.n_splits = n_splits
        self.mode = mode
        self.role = role
        self.shuffle = shuffle
        self.random_seed = random_seed
        # self.evaluate_param = copy.deepcopy(evaluate_param)
        self.need_cv = need_cv
        self.output_fold_history = output_fold_history
        self.history_value_type = history_value_type

    def check(self):
        model_param_descr = "cross validation param's "
        self.check_positive_integer(self.n_splits, model_param_descr)
        self.check_valid_value(self.mode, model_param_descr, valid_values=[consts.HOMO, consts.HETERO])
        self.check_valid_value(self.role, model_param_descr, valid_values=[consts.HOST, consts.GUEST, consts.ARBITER])
        self.check_boolean(self.shuffle, model_param_descr)
        self.check_boolean(self.output_fold_history, model_param_descr)
        self.history_value_type = self.check_and_change_lower(self.history_value_type, ["instance", "score"], model_param_descr)
        if self.random_seed is not None:
            self.check_positive_integer(self.random_seed, model_param_descr)
__init__(self, n_splits=5, mode='hetero', role='guest', shuffle=True, random_seed=1, need_cv=False, output_fold_history=True, history_value_type='score') special
Source code in federatedml/param/cross_validation_param.py
def __init__(self, n_splits=5, mode=consts.HETERO, role=consts.GUEST, shuffle=True, random_seed=1,
             need_cv=False, output_fold_history=True, history_value_type="score"):
    super(CrossValidationParam, self).__init__()
    self.n_splits = n_splits
    self.mode = mode
    self.role = role
    self.shuffle = shuffle
    self.random_seed = random_seed
    # self.evaluate_param = copy.deepcopy(evaluate_param)
    self.need_cv = need_cv
    self.output_fold_history = output_fold_history
    self.history_value_type = history_value_type
check(self)
Source code in federatedml/param/cross_validation_param.py
def check(self):
    model_param_descr = "cross validation param's "
    self.check_positive_integer(self.n_splits, model_param_descr)
    self.check_valid_value(self.mode, model_param_descr, valid_values=[consts.HOMO, consts.HETERO])
    self.check_valid_value(self.role, model_param_descr, valid_values=[consts.HOST, consts.GUEST, consts.ARBITER])
    self.check_boolean(self.shuffle, model_param_descr)
    self.check_boolean(self.output_fold_history, model_param_descr)
    self.history_value_type = self.check_and_change_lower(self.history_value_type, ["instance", "score"], model_param_descr)
    if self.random_seed is not None:
        self.check_positive_integer(self.random_seed, model_param_descr)
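
A minimal usage sketch based on the defaults listed above; check() validates the values and lower-cases string options such as history_value_type.

from federatedml.param.cross_validation_param import CrossValidationParam

cv_param = CrossValidationParam(
    n_splits=5,
    mode="hetero",
    role="guest",
    shuffle=True,
    random_seed=1,
    need_cv=True,                 # actually run cross validation
    output_fold_history=True,
    history_value_type="Score",   # normalized to "score" by check()
)
cv_param.check()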
data_split_param
Classes
DataSplitParam (BaseParam)

Define data split parameters used in data split.

Parameters:

Name Type Description Default
random_state None or int, default: None

Specify the random state for shuffle.

None
test_size float or int or None, default: 0.0

Specify test data set size. A float value specifies the fraction of the input data set; an int value specifies the exact number of data instances.

None
train_size float or int or None, default: 0.8

Specify train data set size. A float value specifies the fraction of the input data set; an int value specifies the exact number of data instances.

None
validate_size float or int or None, default: 0.2

Specify validate data set size. A float value specifies the fraction of the input data set; an int value specifies the exact number of data instances.

None
stratified bool, default: False

Define whether sampling should be stratified, according to label value.

False
shuffle bool, default: True

Define whether to shuffle before splitting.

True
split_points None or list, default : None

Specify the point(s) by which continuous label values are bucketed into bins for stratified split, e.g., [0.2] for two bins or [0.1, 1, 3] for four bins

None
need_run bool, default: True

Specify whether to run data split

True
Source code in federatedml/param/data_split_param.py
class DataSplitParam(BaseParam):
    """
    Define data split parameters used in data split.

    Parameters
    ----------
    random_state : None or int, default: None
        Specify the random state for shuffle.

    test_size : float or int or None, default: 0.0
        Specify test data set size.
        A float value specifies the fraction of the input data set; an int value specifies the exact number of data instances.

    train_size : float or int or None, default: 0.8
        Specify train data set size.
        A float value specifies the fraction of the input data set; an int value specifies the exact number of data instances.

    validate_size : float or int or None, default: 0.2
        Specify validate data set size.
        A float value specifies the fraction of the input data set; an int value specifies the exact number of data instances.

    stratified : bool, default: False
        Define whether sampling should be stratified, according to label value.

    shuffle : bool, default: True
        Define whether to shuffle before splitting.

    split_points : None or list, default : None
        Specify the point(s) by which continuous label values are bucketed into bins for stratified split,
        e.g., [0.2] for two bins or [0.1, 1, 3] for four bins

    need_run: bool, default: True
        Specify whether to run data split

    """

    def __init__(self, random_state=None, test_size=None, train_size=None, validate_size=None, stratified=False,
                 shuffle=True, split_points=None, need_run=True):
        super(DataSplitParam, self).__init__()
        self.random_state = random_state
        self.test_size = test_size
        self.train_size = train_size
        self.validate_size = validate_size
        self.stratified = stratified
        self.shuffle = shuffle
        self.split_points = split_points
        self.need_run = need_run

    def check(self):
        model_param_descr = "data split param's "
        if self.random_state is not None:
            if not isinstance(self.random_state, int):
                raise ValueError(f"{model_param_descr} random state should be int type")
            BaseParam.check_nonnegative_number(self.random_state, f"{model_param_descr} random_state ")

        if self.test_size is not None:
            BaseParam.check_nonnegative_number(self.test_size, f"{model_param_descr} test_size ")
            if isinstance(self.test_size, float):
                BaseParam.check_decimal_float(self.test_size, f"{model_param_descr} test_size ")
        if self.train_size is not None:
            BaseParam.check_nonnegative_number(self.train_size, f"{model_param_descr} train_size ")
            if isinstance(self.train_size, float):
                BaseParam.check_decimal_float(self.train_size, f"{model_param_descr} train_size ")
        if self.validate_size is not None:
            BaseParam.check_nonnegative_number(self.validate_size, f"{model_param_descr} validate_size ")
            if isinstance(self.validate_size, float):
                BaseParam.check_decimal_float(self.validate_size, f"{model_param_descr} validate_size ")
        # use default size values if none given
        if self.test_size is None and self.train_size is None and self.validate_size is None:
            self.test_size = 0.0
            self.train_size = 0.8
            self.validate_size = 0.2

        BaseParam.check_boolean(self.stratified, f"{model_param_descr} stratified ")
        BaseParam.check_boolean(self.shuffle, f"{model_param_descr} shuffle ")
        BaseParam.check_boolean(self.need_run, f"{model_param_descr} need run ")

        if self.split_points is not None:
            if not isinstance(self.split_points, list):
                raise ValueError(f"{model_param_descr} split_points should be list type")

        LOGGER.debug("Finish data_split parameter check!")
        return True
__init__(self, random_state=None, test_size=None, train_size=None, validate_size=None, stratified=False, shuffle=True, split_points=None, need_run=True) special
Source code in federatedml/param/data_split_param.py
def __init__(self, random_state=None, test_size=None, train_size=None, validate_size=None, stratified=False,
             shuffle=True, split_points=None, need_run=True):
    super(DataSplitParam, self).__init__()
    self.random_state = random_state
    self.test_size = test_size
    self.train_size = train_size
    self.validate_size = validate_size
    self.stratified = stratified
    self.shuffle = shuffle
    self.split_points = split_points
    self.need_run = need_run
check(self)
Source code in federatedml/param/data_split_param.py
def check(self):
    model_param_descr = "data split param's "
    if self.random_state is not None:
        if not isinstance(self.random_state, int):
            raise ValueError(f"{model_param_descr} random state should be int type")
        BaseParam.check_nonnegative_number(self.random_state, f"{model_param_descr} random_state ")

    if self.test_size is not None:
        BaseParam.check_nonnegative_number(self.test_size, f"{model_param_descr} test_size ")
        if isinstance(self.test_size, float):
            BaseParam.check_decimal_float(self.test_size, f"{model_param_descr} test_size ")
    if self.train_size is not None:
        BaseParam.check_nonnegative_number(self.train_size, f"{model_param_descr} train_size ")
        if isinstance(self.train_size, float):
            BaseParam.check_decimal_float(self.train_size, f"{model_param_descr} train_size ")
    if self.validate_size is not None:
        BaseParam.check_nonnegative_number(self.validate_size, f"{model_param_descr} validate_size ")
        if isinstance(self.validate_size, float):
            BaseParam.check_decimal_float(self.validate_size, f"{model_param_descr} validate_size ")
    # use default size values if none given
    if self.test_size is None and self.train_size is None and self.validate_size is None:
        self.test_size = 0.0
        self.train_size = 0.8
        self.validate_size = 0.2

    BaseParam.check_boolean(self.stratified, f"{model_param_descr} stratified ")
    BaseParam.check_boolean(self.shuffle, f"{model_param_descr} shuffle ")
    BaseParam.check_boolean(self.need_run, f"{model_param_descr} need run ")

    if self.split_points is not None:
        if not isinstance(self.split_points, list):
            raise ValueError(f"{model_param_descr} split_points should be list type")

    LOGGER.debug("Finish data_split parameter check!")
    return True
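
A minimal usage sketch; as the check() above shows, when test_size, train_size and validate_size are all left as None they resolve to 0.0, 0.8 and 0.2 respectively.

from federatedml.param.data_split_param import DataSplitParam

param = DataSplitParam(random_state=42, shuffle=True)
param.check()
print(param.train_size, param.validate_size, param.test_size)  # 0.8 0.2 0.0

# Stratified split on a continuous label, bucketed at the given split points.
stratified_param = DataSplitParam(stratified=True, split_points=[0.1, 1, 3])
stratified_param.check()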
data_transform_param
Classes
DataTransformParam (BaseParam)

Define data transform parameters used in federated ml.

Parameters:

Name Type Description Default
input_format {'dense', 'sparse', 'tag'}

please refer to the "DataTransform" section of federatedml/util/README.md. Formally, dense input format data should be set to "dense", svm-light input format data should be set to "sparse", and tag or tag:value input format data should be set to "tag".

'dense'
delimitor str

the delimitor of data input, default: ','

','
data_type {'float64','float','int','int64','str','long'}

the data type of data input

'float64'
exclusive_data_type dict

the key of the dict is col_name and the value is data_type; used to specify special data types for some features.

None
tag_with_value bool

used if input_format is 'tag'; if tag_with_value is True, the input column data format should be tag[delimitor]value, otherwise tag only

False
tag_value_delimitor str

used if input_format is 'tag' and tag_with_value is True; the delimitor of the tag[delimitor]value column value.

':'
missing_fill bool

whether to fill missing values; accepts only True/False, default: False

False
default_value None or object or list

the value used to replace missing values. If None, the default value defined in federatedml/feature/imputer.py is used; if a single object, missing values are filled with that object; if a list, its length should equal the feature dimension of the input data, and missing values in a column are replaced by the element at the corresponding position in this list.

0
missing_fill_method None or str

the method used to replace missing values; should be one of [None, 'min', 'max', 'mean', 'designated']

None
missing_impute None or list

defines which values are considered missing; list elements can be of any type. Auto-generated if the value is None.

None
outlier_replace bool

whether to replace outlier values; accepts only True/False, default: False

False
outlier_replace_method None or str

the method used to replace outlier values; should be one of [None, 'min', 'max', 'mean', 'designated']

None
outlier_impute None or list

defines which values are regarded as outliers; list elements can be of any type.

None
outlier_replace_value None or object or list

the value used to replace outliers. If None, the default value defined in federatedml/feature/imputer.py is used; if a single object, outliers are replaced with that object; if a list, its length should equal the feature dimension of the input data, and outliers in a column are replaced by the element at the corresponding position in this list.

0
with_label bool

True if input data contains a label, False otherwise. default: False

False
label_name str

name of the column where the label is located; only used with dense input format. default: 'y'

'y'
label_type {'int','int64','float','float64','long','str'}

used when with_label is True

'int'
output_format {'dense', 'sparse'}

output format

'dense'
with_match_id bool

True if dataset has match_id, default: False

False
Source code in federatedml/param/data_transform_param.py
class DataTransformParam(BaseParam):
    """
    Define data transform parameters used in federated ml.

    Parameters
    ----------
    input_format : {'dense', 'sparse', 'tag'}
        please refer to the "DataTransform" section of federatedml/util/README.md.
        Formally,
            dense input format data should be set to "dense",
            svm-light input format data should be set to "sparse",
            tag or tag:value input format data should be set to "tag".

    delimitor : str
        the delimitor of data input, default: ','

    data_type : {'float64','float','int','int64','str','long'}
        the data type of data input

    exclusive_data_type : dict
        the key of the dict is col_name and the value is data_type;
        used to specify special data types for some features.

    tag_with_value: bool
        used if input_format is 'tag'; if tag_with_value is True,
        the input column data format should be tag[delimitor]value, otherwise tag only

    tag_value_delimitor: str
        used if input_format is 'tag' and tag_with_value is True;
        the delimitor of the tag[delimitor]value column value.

    missing_fill : bool
        whether to fill missing values; accepts only True/False, default: False

    default_value : None or object or list
        the value used to replace missing values.
        If None, the default value defined in federatedml/feature/imputer.py is used;
        if a single object, missing values are filled with that object;
        if a list, its length should equal the feature dimension of the input data,
        and missing values in a column are replaced by the element
        at the corresponding position in this list.

    missing_fill_method: None or str
        the method used to replace missing values; should be one of [None, 'min', 'max', 'mean', 'designated']

    missing_impute: None or list
        defines which values are considered missing; list elements can be of any type.
        Auto-generated if the value is None.

    outlier_replace: bool
        whether to replace outlier values; accepts only True/False, default: False

    outlier_replace_method: None or str
        the method used to replace outlier values; should be one of [None, 'min', 'max', 'mean', 'designated']

    outlier_impute: None or list
        defines which values are regarded as outliers; list elements can be of any type.

    outlier_replace_value: None or object or list
        the value used to replace outliers.
        If None, the default value defined in federatedml/feature/imputer.py is used;
        if a single object, outliers are replaced with that object;
        if a list, its length should equal the feature dimension of the input data,
        and outliers in a column are replaced by the element
        at the corresponding position in this list.

    with_label : bool
        True if input data contains a label, False otherwise. default: False

    label_name : str
        name of the column where the label is located; only used with dense input format. default: 'y'

    label_type : {'int','int64','float','float64','long','str'}
        used when with_label is True

    output_format : {'dense', 'sparse'}
        output format

    with_match_id: bool
        True if dataset has match_id, default: False

    """
    def __init__(self, input_format="dense", delimitor=',', data_type='float64',
                 exclusive_data_type=None,
                 tag_with_value=False, tag_value_delimitor=":",
                 missing_fill=False, default_value=0, missing_fill_method=None,
                 missing_impute=None, outlier_replace=False, outlier_replace_method=None,
                 outlier_impute=None, outlier_replace_value=0,
                 with_label=False, label_name='y',
                 label_type='int', output_format='dense', need_run=True,
                 with_match_id=False):
        self.input_format = input_format
        self.delimitor = delimitor
        self.data_type = data_type
        self.exclusive_data_type = exclusive_data_type
        self.tag_with_value = tag_with_value
        self.tag_value_delimitor = tag_value_delimitor
        self.missing_fill = missing_fill
        self.default_value = default_value
        self.missing_fill_method = missing_fill_method
        self.missing_impute = missing_impute
        self.outlier_replace = outlier_replace
        self.outlier_replace_method = outlier_replace_method
        self.outlier_impute = outlier_impute
        self.outlier_replace_value = outlier_replace_value
        self.with_label = with_label
        self.label_name = label_name
        self.label_type = label_type
        self.output_format = output_format
        self.need_run = need_run
        self.with_match_id = with_match_id

    def check(self):

        descr = "data_transform param's"

        self.input_format = self.check_and_change_lower(self.input_format,
                                                        ["dense", "sparse", "tag"],
                                                        descr)

        self.output_format = self.check_and_change_lower(self.output_format,
                                                         ["dense", "sparse"],
                                                         descr)

        self.data_type = self.check_and_change_lower(self.data_type,
                                                     ["int", "int64", "float", "float64", "str", "long"],
                                                     descr)

        if type(self.missing_fill).__name__ != 'bool':
            raise ValueError("data_transform param's missing_fill {} not supported".format(self.missing_fill))

        if self.missing_fill_method is not None:
            self.missing_fill_method = self.check_and_change_lower(self.missing_fill_method,
                                                                   ['min', 'max', 'mean', 'designated'],
                                                                   descr)

        if self.outlier_replace_method is not None:
            self.outlier_replace_method = self.check_and_change_lower(self.outlier_replace_method,
                                                                      ['min', 'max', 'mean', 'designated'],
                                                                      descr)

        if type(self.with_label).__name__ != 'bool':
            raise ValueError("data_transform param's with_label {} not supported".format(self.with_label))

        if self.with_label:
            if not isinstance(self.label_name, str):
                raise ValueError("data transform param's label_name {} should be str".format(self.label_name))

            self.label_type = self.check_and_change_lower(self.label_type,
                                                          ["int", "int64", "float", "float64", "str", "long"],
                                                          descr)

        if self.exclusive_data_type is not None and not isinstance(self.exclusive_data_type, dict):
            raise ValueError("exclusive_data_type is should be None or a dict")

        if not isinstance(self.with_match_id, bool):
            raise ValueError("with_match_id should be boolean variable, but {} find".format(self.with_match_id))

        return True
__init__(self, input_format='dense', delimitor=',', data_type='float64', exclusive_data_type=None, tag_with_value=False, tag_value_delimitor=':', missing_fill=False, default_value=0, missing_fill_method=None, missing_impute=None, outlier_replace=False, outlier_replace_method=None, outlier_impute=None, outlier_replace_value=0, with_label=False, label_name='y', label_type='int', output_format='dense', need_run=True, with_match_id=False) special
Source code in federatedml/param/data_transform_param.py
def __init__(self, input_format="dense", delimitor=',', data_type='float64',
             exclusive_data_type=None,
             tag_with_value=False, tag_value_delimitor=":",
             missing_fill=False, default_value=0, missing_fill_method=None,
             missing_impute=None, outlier_replace=False, outlier_replace_method=None,
             outlier_impute=None, outlier_replace_value=0,
             with_label=False, label_name='y',
             label_type='int', output_format='dense', need_run=True,
             with_match_id=False):
    self.input_format = input_format
    self.delimitor = delimitor
    self.data_type = data_type
    self.exclusive_data_type = exclusive_data_type
    self.tag_with_value = tag_with_value
    self.tag_value_delimitor = tag_value_delimitor
    self.missing_fill = missing_fill
    self.default_value = default_value
    self.missing_fill_method = missing_fill_method
    self.missing_impute = missing_impute
    self.outlier_replace = outlier_replace
    self.outlier_replace_method = outlier_replace_method
    self.outlier_impute = outlier_impute
    self.outlier_replace_value = outlier_replace_value
    self.with_label = with_label
    self.label_name = label_name
    self.label_type = label_type
    self.output_format = output_format
    self.need_run = need_run
    self.with_match_id = with_match_id
check(self)
Source code in federatedml/param/data_transform_param.py
def check(self):

    descr = "data_transform param's"

    self.input_format = self.check_and_change_lower(self.input_format,
                                                    ["dense", "sparse", "tag"],
                                                    descr)

    self.output_format = self.check_and_change_lower(self.output_format,
                                                     ["dense", "sparse"],
                                                     descr)

    self.data_type = self.check_and_change_lower(self.data_type,
                                                 ["int", "int64", "float", "float64", "str", "long"],
                                                 descr)

    if type(self.missing_fill).__name__ != 'bool':
        raise ValueError("data_transform param's missing_fill {} not supported".format(self.missing_fill))

    if self.missing_fill_method is not None:
        self.missing_fill_method = self.check_and_change_lower(self.missing_fill_method,
                                                               ['min', 'max', 'mean', 'designated'],
                                                               descr)

    if self.outlier_replace_method is not None:
        self.outlier_replace_method = self.check_and_change_lower(self.outlier_replace_method,
                                                                  ['min', 'max', 'mean', 'designated'],
                                                                  descr)

    if type(self.with_label).__name__ != 'bool':
        raise ValueError("data_transform param's with_label {} not supported".format(self.with_label))

    if self.with_label:
        if not isinstance(self.label_name, str):
            raise ValueError("data transform param's label_name {} should be str".format(self.label_name))

        self.label_type = self.check_and_change_lower(self.label_type,
                                                      ["int", "int64", "float", "float64", "str", "long"],
                                                      descr)

    if self.exclusive_data_type is not None and not isinstance(self.exclusive_data_type, dict):
        raise ValueError("exclusive_data_type is should be None or a dict")

    if not isinstance(self.with_match_id, bool):
        raise ValueError("with_match_id should be boolean variable, but {} find".format(self.with_match_id))

    return True
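
A minimal configuration sketch for dense, labeled input with mean imputation of missing values; all options mirror the parameter list above.

from federatedml.param.data_transform_param import DataTransformParam

param = DataTransformParam(
    input_format="dense",
    delimitor=",",
    data_type="float64",
    missing_fill=True,
    missing_fill_method="mean",   # one of 'min', 'max', 'mean', 'designated'
    with_label=True,
    label_name="y",
    label_type="int",
    output_format="dense",
)
param.check()  # normalizes string options to lower case and validates types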
dataio_param
Classes
DataIOParam (BaseParam)

Define dataio parameters used in federated ml.

Parameters:

Name Type Description Default
input_format {'dense', 'sparse', 'tag'}

please refer to the "DataIO" section of federatedml/util/README.md. Formally, dense input format data should be set to "dense", svm-light input format data should be set to "sparse", and tag or tag:value input format data should be set to "tag".

'dense'
delimitor str

the delimitor of data input, default: ','

','
data_type {'float64', 'float', 'int', 'int64', 'str', 'long'}

the data type of data input

'float64'
exclusive_data_type dict

the key of the dict is col_name and the value is data_type; used to specify special data types for some features.

None
tag_with_value bool

used if input_format is 'tag'; if tag_with_value is True, the input column data format should be tag[delimitor]value, otherwise tag only

False
tag_value_delimitor str

used if input_format is 'tag' and tag_with_value is True; the delimitor of the tag[delimitor]value column value.

':'
missing_fill bool

whether to fill missing values; accepts only True/False, default: False

False
default_value None or object or list

the value used to replace missing values. If None, the default value defined in federatedml/feature/imputer.py is used; if a single object, missing values are filled with that object; if a list, its length should equal the feature dimension of the input data, and missing values in a column are replaced by the element at the corresponding position in this list.

0
missing_fill_method {None, 'min', 'max', 'mean', 'designated'}

the method used to replace missing values

None
missing_impute None or list

defines which values are considered missing; list elements can be of any type. Auto-generated if the value is None.

None
outlier_replace bool

whether to replace outlier values; accepts only True/False, default: False

False
outlier_replace_method {None, 'min', 'max', 'mean', 'designated'}

the method used to replace outlier values

None
outlier_impute None or list

defines which values are regarded as outliers; list elements can be of any type. default: None

None
outlier_replace_value None or object or list

the value used to replace outliers. If None, the default value defined in federatedml/feature/imputer.py is used; if a single object, outliers are replaced with that object; if a list, its length should equal the feature dimension of the input data, and outliers in a column are replaced by the element at the corresponding position in this list.

0
with_label bool

True if input data contains a label, False otherwise. default: False

False
label_name str

name of the column where the label is located; only used with dense input format. default: 'y'

'y'
label_type {'int', 'int64', 'float', 'float64', 'long', 'str'}

used when with_label is True.

'int'
output_format {'dense', 'sparse'}

output format

'dense'
Source code in federatedml/param/dataio_param.py
class DataIOParam(BaseParam):
    """
    Define dataio parameters used in federated ml.

    Parameters
    ----------
    input_format : {'dense', 'sparse', 'tag'}
        please refer to the "DataIO" section of federatedml/util/README.md.
        Formally,
            dense input format data should be set to "dense",
            svm-light input format data should be set to "sparse",
            tag or tag:value input format data should be set to "tag".

    delimitor : str
        the delimitor of data input, default: ','

    data_type : {'float64', 'float', 'int', 'int64', 'str', 'long'}
        the data type of data input

    exclusive_data_type : dict
        the key of the dict is col_name and the value is data_type;
        used to specify special data types for some features.

    tag_with_value: bool
        used if input_format is 'tag'; if tag_with_value is True,
        the input column data format should be tag[delimitor]value, otherwise tag only

    tag_value_delimitor: str
        used if input_format is 'tag' and tag_with_value is True;
        the delimitor of the tag[delimitor]value column value.

    missing_fill : bool
        whether to fill missing values; accepts only True/False, default: False

    default_value : None or object or list
        the value used to replace missing values.
            If None, the default value defined in federatedml/feature/imputer.py is used;
            if a single object, missing values are filled with that object;
            if a list, its length should equal the feature dimension of the input data,
                and missing values in a column are replaced by the element
                at the corresponding position in this list.

    missing_fill_method : {None, 'min', 'max', 'mean', 'designated'}
        the method used to replace missing values

    missing_impute: None or list
        defines which values are considered missing; list elements can be of any type.
        Auto-generated if the value is None.

    outlier_replace: bool
        whether to replace outlier values; accepts only True/False, default: False

    outlier_replace_method : {None, 'min', 'max', 'mean', 'designated'}
        the method used to replace outlier values

    outlier_impute: None or list
        defines which values are regarded as outliers; list elements can be of any type. default: None

    outlier_replace_value : None or object or list
        the value used to replace outliers.
            If None, the default value defined in federatedml/feature/imputer.py is used;
            if a single object, outliers are replaced with that object;
            if a list, its length should equal the feature dimension of the input data,
                and outliers in a column are replaced by the element
                at the corresponding position in this list.

    with_label : bool
        True if input data contains a label, False otherwise. default: False

    label_name : str
        name of the column where the label is located; only used with dense input format. default: 'y'

    label_type : {'int', 'int64', 'float', 'float64', 'long', 'str'}
        used when with_label is True.

    output_format : {'dense', 'sparse'}
        output format

    """

    def __init__(self, input_format="dense", delimitor=',', data_type='float64',
                 exclusive_data_type=None,
                 tag_with_value=False, tag_value_delimitor=":",
                 missing_fill=False, default_value=0, missing_fill_method=None,
                 missing_impute=None, outlier_replace=False, outlier_replace_method=None,
                 outlier_impute=None, outlier_replace_value=0,
                 with_label=False, label_name='y',
                 label_type='int', output_format='dense', need_run=True):
        self.input_format = input_format
        self.delimitor = delimitor
        self.data_type = data_type
        self.exclusive_data_type = exclusive_data_type
        self.tag_with_value = tag_with_value
        self.tag_value_delimitor = tag_value_delimitor
        self.missing_fill = missing_fill
        self.default_value = default_value
        self.missing_fill_method = missing_fill_method
        self.missing_impute = missing_impute
        self.outlier_replace = outlier_replace
        self.outlier_replace_method = outlier_replace_method
        self.outlier_impute = outlier_impute
        self.outlier_replace_value = outlier_replace_value
        self.with_label = with_label
        self.label_name = label_name
        self.label_type = label_type
        self.output_format = output_format
        self.need_run = need_run

    def check(self):

        descr = "dataio param's"

        self.input_format = self.check_and_change_lower(self.input_format,
                                                        ["dense", "sparse", "tag"],
                                                        descr)

        self.output_format = self.check_and_change_lower(self.output_format,
                                                         ["dense", "sparse"],
                                                         descr)

        self.data_type = self.check_and_change_lower(self.data_type,
                                                     ["int", "int64", "float", "float64", "str", "long"],
                                                     descr)

        if type(self.missing_fill).__name__ != 'bool':
            raise ValueError("dataio param's missing_fill {} not supported".format(self.missing_fill))

        if self.missing_fill_method is not None:
            self.missing_fill_method = self.check_and_change_lower(self.missing_fill_method,
                                                                   ['min', 'max', 'mean', 'designated'],
                                                                   descr)

        if self.outlier_replace_method is not None:
            self.outlier_replace_method = self.check_and_change_lower(self.outlier_replace_method,
                                                                      ['min', 'max', 'mean', 'designated'],
                                                                      descr)

        if type(self.with_label).__name__ != 'bool':
            raise ValueError("dataio param's with_label {} not supported".format(self.with_label))

        if self.with_label:
            if not isinstance(self.label_name, str):
                raise ValueError("dataio param's label_name {} should be str".format(self.label_name))

            self.label_type = self.check_and_change_lower(self.label_type,
                                                          ["int", "int64", "float", "float64", "str", "long"],
                                                          descr)

        if self.exclusive_data_type is not None and not isinstance(self.exclusive_data_type, dict):
            raise ValueError("exclusive_data_type is should be None or a dict")

        return True
__init__(self, input_format='dense', delimitor=',', data_type='float64', exclusive_data_type=None, tag_with_value=False, tag_value_delimitor=':', missing_fill=False, default_value=0, missing_fill_method=None, missing_impute=None, outlier_replace=False, outlier_replace_method=None, outlier_impute=None, outlier_replace_value=0, with_label=False, label_name='y', label_type='int', output_format='dense', need_run=True) special
Source code in federatedml/param/dataio_param.py
def __init__(self, input_format="dense", delimitor=',', data_type='float64',
             exclusive_data_type=None,
             tag_with_value=False, tag_value_delimitor=":",
             missing_fill=False, default_value=0, missing_fill_method=None,
             missing_impute=None, outlier_replace=False, outlier_replace_method=None,
             outlier_impute=None, outlier_replace_value=0,
             with_label=False, label_name='y',
             label_type='int', output_format='dense', need_run=True):
    self.input_format = input_format
    self.delimitor = delimitor
    self.data_type = data_type
    self.exclusive_data_type = exclusive_data_type
    self.tag_with_value = tag_with_value
    self.tag_value_delimitor = tag_value_delimitor
    self.missing_fill = missing_fill
    self.default_value = default_value
    self.missing_fill_method = missing_fill_method
    self.missing_impute = missing_impute
    self.outlier_replace = outlier_replace
    self.outlier_replace_method = outlier_replace_method
    self.outlier_impute = outlier_impute
    self.outlier_replace_value = outlier_replace_value
    self.with_label = with_label
    self.label_name = label_name
    self.label_type = label_type
    self.output_format = output_format
    self.need_run = need_run
check(self)
Source code in federatedml/param/dataio_param.py
def check(self):

    descr = "dataio param's"

    self.input_format = self.check_and_change_lower(self.input_format,
                                                    ["dense", "sparse", "tag"],
                                                    descr)

    self.output_format = self.check_and_change_lower(self.output_format,
                                                     ["dense", "sparse"],
                                                     descr)

    self.data_type = self.check_and_change_lower(self.data_type,
                                                 ["int", "int64", "float", "float64", "str", "long"],
                                                 descr)

    if type(self.missing_fill).__name__ != 'bool':
        raise ValueError("dataio param's missing_fill {} not supported".format(self.missing_fill))

    if self.missing_fill_method is not None:
        self.missing_fill_method = self.check_and_change_lower(self.missing_fill_method,
                                                               ['min', 'max', 'mean', 'designated'],
                                                               descr)

    if self.outlier_replace_method is not None:
        self.outlier_replace_method = self.check_and_change_lower(self.outlier_replace_method,
                                                                  ['min', 'max', 'mean', 'designated'],
                                                                  descr)

    if type(self.with_label).__name__ != 'bool':
        raise ValueError("dataio param's with_label {} not supported".format(self.with_label))

    if self.with_label:
        if not isinstance(self.label_name, str):
            raise ValueError("dataio param's label_name {} should be str".format(self.label_name))

        self.label_type = self.check_and_change_lower(self.label_type,
                                                      ["int", "int64", "float", "float64", "str", "long"],
                                                      descr)

    if self.exclusive_data_type is not None and not isinstance(self.exclusive_data_type, dict):
        raise ValueError("exclusive_data_type is should be None or a dict")

    return True
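
A minimal configuration sketch for svm-light ("sparse") input without a label column; options mirror the parameter list above.

from federatedml.param.dataio_param import DataIOParam

param = DataIOParam(input_format="sparse", data_type="float", with_label=False)
param.check()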
encrypt_param
Classes
EncryptParam (BaseParam)

Define encryption method that used in federated ml.

Parameters:

Name Type Description Default
method {'Paillier', 'IterativeAffine', 'RandomIterativeAffine'}

If method is 'Paillier', Paillier encryption will be used for federated ml. To use the non-encryption version in HomoLR, set this to None. For details of Paillier encryption, please check out the paper mentioned in the README file.

'Paillier'
key_length int, default: 1024

Used to specify the length of key in this encryption method.

1024
Source code in federatedml/param/encrypt_param.py
class EncryptParam(BaseParam):
    """
    Define encryption method that used in federated ml.

    Parameters
    ----------
    method : {'Paillier', 'IterativeAffine', 'RandomIterativeAffine'}
        If method is 'Paillier', Paillier encryption will be used for federated ml.
        To use the non-encryption version in HomoLR, set this to None.
        For details of Paillier encryption, please check out the paper mentioned in the README file.

    key_length : int, default: 1024
        Used to specify the length of key in this encryption method.

    """

    def __init__(self, method=consts.PAILLIER, key_length=1024):
        super(EncryptParam, self).__init__()
        self.method = method
        self.key_length = key_length

    def check(self):
        if self.method is not None and type(self.method).__name__ != "str":
            raise ValueError(
                "encrypt_param's method {} not supported, should be str type".format(
                    self.method))
        elif self.method is None:
            pass
        else:
            user_input = self.method.lower()
            if user_input == "paillier":
                self.method = consts.PAILLIER
            elif user_input == "iterativeaffine":
                self.method = consts.ITERATIVEAFFINE
            elif user_input == "randomiterativeaffine":
                self.method = consts.RANDOM_ITERATIVEAFFINE
            else:
                raise ValueError(
                    "encrypt_param's method {} not supported".format(user_input))

        if type(self.key_length).__name__ != "int":
            raise ValueError(
                "encrypt_param's key_length {} not supported, should be int type".format(self.key_length))
        elif self.key_length <= 0:
            raise ValueError(
                "encrypt_param's key_length must be greater or equal to 1")

        LOGGER.debug("Finish encrypt parameter check!")
        return True
__init__(self, method='Paillier', key_length=1024) special
Source code in federatedml/param/encrypt_param.py
def __init__(self, method=consts.PAILLIER, key_length=1024):
    super(EncryptParam, self).__init__()
    self.method = method
    self.key_length = key_length
check(self)
Source code in federatedml/param/encrypt_param.py
def check(self):
    if self.method is not None and type(self.method).__name__ != "str":
        raise ValueError(
            "encrypt_param's method {} not supported, should be str type".format(
                self.method))
    elif self.method is None:
        pass
    else:
        user_input = self.method.lower()
        if user_input == "paillier":
            self.method = consts.PAILLIER
        elif user_input == "iterativeaffine":
            self.method = consts.ITERATIVEAFFINE
        elif user_input == "randomiterativeaffine":
            self.method = consts.RANDOM_ITERATIVEAFFINE
        else:
            raise ValueError(
                "encrypt_param's method {} not supported".format(user_input))

    if type(self.key_length).__name__ != "int":
        raise ValueError(
            "encrypt_param's key_length {} not supported, should be int type".format(self.key_length))
    elif self.key_length <= 0:
        raise ValueError(
            "encrypt_param's key_length must be greater or equal to 1")

    LOGGER.debug("Finish encrypt parameter check!")
    return True
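
A minimal usage sketch; method strings are matched case-insensitively by check(), and method=None disables encryption (supported by HomoLR).

from federatedml.param.encrypt_param import EncryptParam

param = EncryptParam(method="Paillier", key_length=2048)
param.check()

no_encrypt_param = EncryptParam(method=None)
no_encrypt_param.check()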
encrypted_mode_calculation_param
Classes
EncryptedModeCalculatorParam (BaseParam)

Define the encrypted_mode_calculator parameters.

Parameters:

Name Type Description Default
mode {'strict', 'fast', 'balance', 'confusion_opt'}

encrypted mode, default: strict

'strict'
re_encrypted_rate float or int

a numeric value in [0, 1]; used when mode is 'balance', default: 1

1
Source code in federatedml/param/encrypted_mode_calculation_param.py
class EncryptedModeCalculatorParam(BaseParam):
    """
    Define the encrypted_mode_calculator parameters.

    Parameters
    ----------
    mode: {'strict', 'fast', 'balance', 'confusion_opt'}
        encrypted mode, default: strict

    re_encrypted_rate: float or int
        a numeric value in [0, 1]; used when mode is 'balance', default: 1
    """

    def __init__(self, mode="strict", re_encrypted_rate=1):
        self.mode = mode
        self.re_encrypted_rate = re_encrypted_rate

    def check(self):
        descr = "encrypted_mode_calculator param"
        self.mode = self.check_and_change_lower(self.mode,
                                                ["strict", "fast", "balance", "confusion_opt", "confusion_opt_balance"],
                                                descr)

        if self.mode in ["balance", "confusion_opt_balance"]:
            if type(self.re_encrypted_rate).__name__ not in ["int", "long", "float"]:
                raise ValueError("re_encrypted_rate should be a numeric value")

            if not 0.0 <= self.re_encrypted_rate <= 1:
                raise ValueError("re_encrypted_rate should be in [0, 1]")

        return True
__init__(self, mode='strict', re_encrypted_rate=1) special
Source code in federatedml/param/encrypted_mode_calculation_param.py
def __init__(self, mode="strict", re_encrypted_rate=1):
    self.mode = mode
    self.re_encrypted_rate = re_encrypted_rate
check(self)
Source code in federatedml/param/encrypted_mode_calculation_param.py
def check(self):
    descr = "encrypted_mode_calculator param"
    self.mode = self.check_and_change_lower(self.mode,
                                            ["strict", "fast", "balance", "confusion_opt", "confusion_opt_balance"],
                                            descr)

    if self.mode in ["balance", "confusion_opt_balance"]:
        if type(self.re_encrypted_rate).__name__ not in ["int", "long", "float"]:
            raise ValueError("re_encrypted_rate should be a numeric value")

        if not 0.0 <= self.re_encrypted_rate <= 1:
            raise ValueError("re_encrypted_rate should be in [0, 1]")

    return True
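
A minimal usage sketch; as the check() above shows, re_encrypted_rate is only validated when mode is 'balance' or 'confusion_opt_balance'.

from federatedml.param.encrypted_mode_calculation_param import EncryptedModeCalculatorParam

param = EncryptedModeCalculatorParam(mode="balance", re_encrypted_rate=0.3)
param.check()  # verifies that re_encrypted_rate lies in [0, 1]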
evaluation_param
Classes
EvaluateParam (BaseParam)

Define the evaluation method of binary/multiple classification and regression

Parameters:

Name Type Description Default
eval_type {'binary', 'regression', 'multi'}

supports 'binary' for HomoLR, HeteroLR and Secureboosting; supports 'regression' for Secureboosting; 'multi' is not supported in this version

'binary'
unfold_multi_result bool

unfold multi result and get several one-vs-rest binary classification results

False
pos_label int or float or str

specify the positive label, depending on the data's label. This parameter is effective only for 'binary'

1
need_run bool, default True

Indicate if this module needs to be run

True
Source code in federatedml/param/evaluation_param.py
class EvaluateParam(BaseParam):
    """
    Define the evaluation method of binary/multiple classification and regression

    Parameters
    ----------
    eval_type : {'binary', 'regression', 'multi'}
        supports 'binary' for HomoLR, HeteroLR and Secureboosting;
        supports 'regression' for Secureboosting;
        'multi' is not supported in this version

    unfold_multi_result : bool
        unfold multi result and get several one-vs-rest binary classification results

    pos_label : int or float or str
        specify the positive label, depending on the data's label. This parameter is effective only for 'binary'

    need_run: bool, default True
        Indicate if this module needs to be run
    """

    def __init__(self, eval_type="binary", pos_label=1, need_run=True, metrics=None,
                 run_clustering_arbiter_metric=False, unfold_multi_result=False):
        super().__init__()
        self.eval_type = eval_type
        self.pos_label = pos_label
        self.need_run = need_run
        self.metrics = metrics
        self.unfold_multi_result = unfold_multi_result
        self.run_clustering_arbiter_metric = run_clustering_arbiter_metric

        self.default_metrics = {
            consts.BINARY: consts.ALL_BINARY_METRICS,
            consts.MULTY: consts.ALL_MULTI_METRICS,
            consts.REGRESSION: consts.ALL_REGRESSION_METRICS,
            consts.CLUSTERING: consts.ALL_CLUSTER_METRICS
        }

        self.allowed_metrics = {
            consts.BINARY: consts.ALL_BINARY_METRICS,
            consts.MULTY: consts.ALL_MULTI_METRICS,
            consts.REGRESSION: consts.ALL_REGRESSION_METRICS,
            consts.CLUSTERING: consts.ALL_CLUSTER_METRICS
        }

    def _use_single_value_default_metrics(self):

        self.default_metrics = {
            consts.BINARY: consts.DEFAULT_BINARY_METRIC,
            consts.MULTY: consts.DEFAULT_MULTI_METRIC,
            consts.REGRESSION: consts.DEFAULT_REGRESSION_METRIC,
            consts.CLUSTERING: consts.DEFAULT_CLUSTER_METRIC
        }

    def _check_valid_metric(self, metrics_list):

        metric_list = consts.ALL_METRIC_NAME
        alias_name: dict = consts.ALIAS

        full_name_list = []

        metrics_list = [str.lower(i) for i in metrics_list]

        for metric in metrics_list:

            if metric in metric_list:
                if metric not in full_name_list:
                    full_name_list.append(metric)
                continue

            valid_flag = False
            for alias, full_name in alias_name.items():
                if metric in alias:
                    if full_name not in full_name_list:
                        full_name_list.append(full_name)
                    valid_flag = True
                    break

            if not valid_flag:
                raise ValueError('metric {} is not supported'.format(metric))

        allowed_metrics = self.allowed_metrics[self.eval_type]

        for m in full_name_list:
            if m not in allowed_metrics:
                raise ValueError('metric {} is not used for {} task'.format(m, self.eval_type))

        if consts.RECALL in full_name_list and consts.PRECISION not in full_name_list:
            full_name_list.append(consts.PRECISION)

        if consts.RECALL not in full_name_list and consts.PRECISION in full_name_list:
            full_name_list.append(consts.RECALL)

        return full_name_list

    def check(self):

        descr = "evaluate param's "
        self.eval_type = self.check_and_change_lower(self.eval_type,
                                                       [consts.BINARY, consts.MULTY, consts.REGRESSION,
                                                        consts.CLUSTERING],
                                                       descr)

        if type(self.pos_label).__name__ not in ["str", "float", "int"]:
            raise ValueError(
                "evaluate param's pos_label {} not supported, should be str or float or int type".format(
                    self.pos_label))

        if type(self.need_run).__name__ != "bool":
            raise ValueError(
                "evaluate param's need_run {} not supported, should be bool".format(
                    self.need_run))

        if self.metrics is None or len(self.metrics) == 0:
            self.metrics = self.default_metrics[self.eval_type]
            LOGGER.warning('use default metric {} for eval type {}'.format(self.metrics, self.eval_type)) 

        self.check_boolean(self.unfold_multi_result, 'multi_result_unfold')

        self.metrics = self._check_valid_metric(self.metrics)

        LOGGER.info("Finish evaluation parameter check!")

        return True

    def check_single_value_default_metric(self):
        self._use_single_value_default_metrics()

        # in validation strategy, psi f1-score and confusion-mat pr-quantile are not supported in cur version
        if self.metrics is None or len(self.metrics) == 0:
            self.metrics = self.default_metrics[self.eval_type]
            LOGGER.warning('use default metric {} for eval type {}'.format(self.metrics, self.eval_type))

        ban_metric = [consts.PSI, consts.F1_SCORE, consts.CONFUSION_MAT, consts.QUANTILE_PR]
        # build a new list: removing items while iterating would skip elements
        self.metrics = [metric for metric in self.metrics if metric not in ban_metric]
        self.check()
__init__(self, eval_type='binary', pos_label=1, need_run=True, metrics=None, run_clustering_arbiter_metric=False, unfold_multi_result=False) special
Source code in federatedml/param/evaluation_param.py
def __init__(self, eval_type="binary", pos_label=1, need_run=True, metrics=None,
             run_clustering_arbiter_metric=False, unfold_multi_result=False):
    super().__init__()
    self.eval_type = eval_type
    self.pos_label = pos_label
    self.need_run = need_run
    self.metrics = metrics
    self.unfold_multi_result = unfold_multi_result
    self.run_clustering_arbiter_metric = run_clustering_arbiter_metric

    self.default_metrics = {
        consts.BINARY: consts.ALL_BINARY_METRICS,
        consts.MULTY: consts.ALL_MULTI_METRICS,
        consts.REGRESSION: consts.ALL_REGRESSION_METRICS,
        consts.CLUSTERING: consts.ALL_CLUSTER_METRICS
    }

    self.allowed_metrics = {
        consts.BINARY: consts.ALL_BINARY_METRICS,
        consts.MULTY: consts.ALL_MULTI_METRICS,
        consts.REGRESSION: consts.ALL_REGRESSION_METRICS,
        consts.CLUSTERING: consts.ALL_CLUSTER_METRICS
    }
check(self)
Source code in federatedml/param/evaluation_param.py
def check(self):

    descr = "evaluate param's "
    self.eval_type = self.check_and_change_lower(self.eval_type,
                                                   [consts.BINARY, consts.MULTY, consts.REGRESSION,
                                                    consts.CLUSTERING],
                                                   descr)

    if type(self.pos_label).__name__ not in ["str", "float", "int"]:
        raise ValueError(
            "evaluate param's pos_label {} not supported, should be str or float or int type".format(
                self.pos_label))

    if type(self.need_run).__name__ != "bool":
        raise ValueError(
            "evaluate param's need_run {} not supported, should be bool".format(
                self.need_run))

    if self.metrics is None or len(self.metrics) == 0:
        self.metrics = self.default_metrics[self.eval_type]
        LOGGER.warning('use default metric {} for eval type {}'.format(self.metrics, self.eval_type)) 

    self.check_boolean(self.unfold_multi_result, 'multi_result_unfold')

    self.metrics = self._check_valid_metric(self.metrics)

    LOGGER.info("Finish evaluation parameter check!")

    return True
check_single_value_default_metric(self)
Source code in federatedml/param/evaluation_param.py
def check_single_value_default_metric(self):
    self._use_single_value_default_metrics()

    # in validation strategy, psi f1-score and confusion-mat pr-quantile are not supported in cur version
    if self.metrics is None or len(self.metrics) == 0:
        self.metrics = self.default_metrics[self.eval_type]
        LOGGER.warning('use default metric {} for eval type {}'.format(self.metrics, self.eval_type))

    ban_metric = [consts.PSI, consts.F1_SCORE, consts.CONFUSION_MAT, consts.QUANTILE_PR]
    # build a new list: removing items while iterating would skip elements
    self.metrics = [metric for metric in self.metrics if metric not in ban_metric]
    self.check()
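
A minimal usage sketch; metric names such as "auc" and "ks" are assumed to be valid binary metrics (or aliases) defined in consts. As _check_valid_metric() above shows, aliases are resolved to full names, and precision and recall are always paired.

from federatedml.param.evaluation_param import EvaluateParam

param = EvaluateParam(eval_type="binary", pos_label=1, metrics=["auc", "ks", "recall"])
param.check()
print(param.metrics)  # "precision" is appended automatically alongside "recall"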
feature_binning_param
Classes
TransformParam (BaseParam)

Define how to transform the columns

Parameters:

Name Type Description Default
transform_cols list of column index, default: -1

Specify which columns need to be transformed. If column index is None, no columns will be transformed. If it is -1, the same columns as cols in the binning module will be used.

-1
transform_names list of string, default: []

Specify which columns need to be calculated. Each element in the list represents a column name in the header.

None
transform_type {'bin_num', 'woe', None}

Specify which value these columns will be replaced with. 1. bin_num: Transform the original feature value to the index of the bin it belongs to. 2. woe: Valid for the guest party only; replaces the original value with its woe value. 3. None: nothing will be replaced.

'bin_num'
Source code in federatedml/param/feature_binning_param.py
class TransformParam(BaseParam):
    """
    Define how to transform the columns

    Parameters
    ----------
    transform_cols : list of column index, default: -1
        Specify which columns need to be transformed. If column index is None, no columns will be transformed.
        If it is -1, the same columns as cols in the binning module will be used.

    transform_names: list of string, default: []
        Specify which columns need to be calculated. Each element in the list represents a column name in the header.


    transform_type: {'bin_num', 'woe', None}
        Specify which value these columns will be replaced with.
         1. bin_num: Transform the original feature value to the index of the bin it belongs to.
         2. woe: Valid for the guest party only; replaces the original value with its woe value.
         3. None: nothing will be replaced.
    """

    def __init__(self, transform_cols=-1, transform_names=None, transform_type="bin_num"):
        super(TransformParam, self).__init__()
        self.transform_cols = transform_cols
        self.transform_names = transform_names
        self.transform_type = transform_type

    def check(self):
        descr = "Transform Param's "
        if self.transform_cols is not None and self.transform_cols != -1:
            self.check_defined_type(self.transform_cols, descr, ['list'])
        self.check_defined_type(self.transform_names, descr, ['list', "NoneType"])
        if self.transform_names is not None:
            for name in self.transform_names:
                if not isinstance(name, str):
                    raise ValueError("Elements in transform_names should be string type")
        self.check_valid_value(self.transform_type, descr, ['bin_num', 'woe', None])
__init__(self, transform_cols=-1, transform_names=None, transform_type='bin_num') special
Source code in federatedml/param/feature_binning_param.py
def __init__(self, transform_cols=-1, transform_names=None, transform_type="bin_num"):
    super(TransformParam, self).__init__()
    self.transform_cols = transform_cols
    self.transform_names = transform_names
    self.transform_type = transform_type
check(self)
Source code in federatedml/param/feature_binning_param.py
def check(self):
    descr = "Transform Param's "
    if self.transform_cols is not None and self.transform_cols != -1:
        self.check_defined_type(self.transform_cols, descr, ['list'])
    self.check_defined_type(self.transform_names, descr, ['list', "NoneType"])
    if self.transform_names is not None:
        for name in self.transform_names:
            if not isinstance(name, str):
                raise ValueError("Elements in transform_names should be string type")
    self.check_valid_value(self.transform_type, descr, ['bin_num', 'woe', None])
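
A minimal usage sketch (assuming TransformParam is importable from federatedml/param/feature_binning_param.py, the source path shown above):

from federatedml.param.feature_binning_param import TransformParam

# Replace guest-side feature values with their woe values after binning;
# transform_cols=-1 reuses the same columns that were binned.
woe_transform = TransformParam(transform_cols=-1,
                               transform_names=None,
                               transform_type="woe")
woe_transform.check()  # validates types and that transform_type is an allowed value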
OptimalBinningParam (BaseParam)

Indicate optimal binning params

Parameters:

Name Type Description Default
metric_method str, default: "iv"

The algorithm metric method. Support iv, gini, ks, chi-square

'iv'
min_bin_pct float, default: 0.05

The minimum percentage of each bucket

0.05
max_bin_pct float, default: 1.0

The maximum percentage of each bucket

1.0
init_bin_nums int, default: 1000

Number of bins at initialization

1000
mixture bool, default: True

Whether each bucket needs both event and non-event records

True
init_bucket_method str, default: 'quantile'

Initial bucketing method. Accepts 'quantile' and 'bucket'.

'quantile'
Source code in federatedml/param/feature_binning_param.py
class OptimalBinningParam(BaseParam):
    """
    Indicate optimal binning params

    Parameters
    ----------
    metric_method: str, default: "iv"
        The algorithm metric method. Support iv, gini, ks, chi-square


    min_bin_pct: float, default: 0.05
        The minimum percentage of each bucket

    max_bin_pct: float, default: 1.0
        The maximum percentage of each bucket

    init_bin_nums: int, default: 1000
        Number of bins at initialization

    mixture: bool, default: True
        Whether each bucket needs both event and non-event records

    init_bucket_method: str, default: 'quantile'
        Initial bucketing method. Accepts 'quantile' and 'bucket'.

    """

    def __init__(self, metric_method='iv', min_bin_pct=0.05, max_bin_pct=1.0,
                 init_bin_nums=1000, mixture=True, init_bucket_method='quantile'):
        super().__init__()
        self.init_bucket_method = init_bucket_method
        self.metric_method = metric_method
        self.max_bin = None
        self.mixture = mixture
        self.max_bin_pct = max_bin_pct
        self.min_bin_pct = min_bin_pct
        self.init_bin_nums = init_bin_nums
        self.adjustment_factor = None

    def check(self):
        descr = "hetero binning's optimal binning param's"
        self.check_string(self.metric_method, descr)

        self.metric_method = self.metric_method.lower()
        if self.metric_method in ['chi_square', 'chi-square']:
            self.metric_method = 'chi_square'
        self.check_valid_value(self.metric_method, descr, ['iv', 'gini', 'chi_square', 'ks'])
        self.check_positive_integer(self.init_bin_nums, descr)

        self.init_bucket_method = self.init_bucket_method.lower()
        self.check_valid_value(self.init_bucket_method, descr, ['quantile', 'bucket'])

        if self.max_bin_pct not in [1, 0]:
            self.check_decimal_float(self.max_bin_pct, descr)
        if self.min_bin_pct not in [1, 0]:
            self.check_decimal_float(self.min_bin_pct, descr)
        if self.min_bin_pct > self.max_bin_pct:
            raise ValueError("Optimal binning's min_bin_pct should less or equal than max_bin_pct")

        self.check_boolean(self.mixture, descr)
        self.check_positive_integer(self.init_bin_nums, descr)
__init__(self, metric_method='iv', min_bin_pct=0.05, max_bin_pct=1.0, init_bin_nums=1000, mixture=True, init_bucket_method='quantile') special
Source code in federatedml/param/feature_binning_param.py
def __init__(self, metric_method='iv', min_bin_pct=0.05, max_bin_pct=1.0,
             init_bin_nums=1000, mixture=True, init_bucket_method='quantile'):
    super().__init__()
    self.init_bucket_method = init_bucket_method
    self.metric_method = metric_method
    self.max_bin = None
    self.mixture = mixture
    self.max_bin_pct = max_bin_pct
    self.min_bin_pct = min_bin_pct
    self.init_bin_nums = init_bin_nums
    self.adjustment_factor = None
check(self)
Source code in federatedml/param/feature_binning_param.py
def check(self):
    descr = "hetero binning's optimal binning param's"
    self.check_string(self.metric_method, descr)

    self.metric_method = self.metric_method.lower()
    if self.metric_method in ['chi_square', 'chi-square']:
        self.metric_method = 'chi_square'
    self.check_valid_value(self.metric_method, descr, ['iv', 'gini', 'chi_square', 'ks'])
    self.check_positive_integer(self.init_bin_nums, descr)

    self.init_bucket_method = self.init_bucket_method.lower()
    self.check_valid_value(self.init_bucket_method, descr, ['quantile', 'bucket'])

    if self.max_bin_pct not in [1, 0]:
        self.check_decimal_float(self.max_bin_pct, descr)
    if self.min_bin_pct not in [1, 0]:
        self.check_decimal_float(self.min_bin_pct, descr)
    if self.min_bin_pct > self.max_bin_pct:
        raise ValueError("Optimal binning's min_bin_pct should less or equal than max_bin_pct")

    self.check_boolean(self.mixture, descr)
    self.check_positive_integer(self.init_bin_nums, descr)
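
A sketch of how check() normalizes its inputs, assuming the import path shown above; note that 'chi-square' is rewritten to 'chi_square':

from federatedml.param.feature_binning_param import OptimalBinningParam

param = OptimalBinningParam(metric_method="Chi-Square",  # lower-cased, then mapped to 'chi_square'
                            min_bin_pct=0.05,
                            max_bin_pct=0.8,
                            init_bin_nums=1000,
                            init_bucket_method="quantile")
param.check()
print(param.metric_method)  # 'chi_square'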
FeatureBinningParam (BaseParam)

Define the feature binning method

Parameters:

Name Type Description Default
method str, 'quantile', 'bucket' or 'optimal', default: 'quantile'

Binning method.

'quantile'
compress_thres int, default: 10000

When the number of saved summaries exceed this threshold, it will call its compress function

10000
head_size int, default: 10000

The buffer size to store inserted observations. When head list reach this buffer size, the QuantileSummaries object start to generate summary(or stats) and insert into its sampled list.

10000
error float, 0 <= error < 1, default: 0.0001

The tolerated error of binning. The final split point comes from the original data, and the rank of this value is close to the exact rank. More precisely, floor((p - 2 * error) * N) <= rank(x) <= ceil((p + 2 * error) * N), where p is the quantile in float and N is the total number of data points.

0.0001
bin_num int, bin_num > 0, default: 10

The maximum number of bins for binning

10
bin_indexes list of int or int, default: -1

Specify which columns need to be binned. -1 represents all columns. To indicate specific columns, provide a list of header indexes instead of -1.

-1
bin_names list of string, default: []

Specify which columns need to be binned, by name. Each element in the list represents a column name in the header.

None
adjustment_factor float, default: 0.5

The adjustment factor when calculating WOE. This is useful when there is no event or non-event in a bin. Please note that this parameter will NOT take effect when set on the host.

0.5
category_indexes list of int or int, default: []

Specify which columns are category features. -1 represents all columns. A list of int indicates a set of such features. For category features, bin_obj will take their original values as split_points and treat them as having been binned. If this is not what you expect, do NOT include them in this parameter.

The number of categories should not exceed bin_num set above.

None
category_names list of string, default: []

Use column names to specify category features. Each element in the list represents a column name in the header.

None
local_only bool, default: False

Whether to provide the binning method to the guest party only. If True, the host party will do nothing. Warning: this parameter will be deprecated in a future version.

False
transform_param TransformParam

Define how to transform the binned data.

TransformParam()
need_run bool, default True

Indicate whether this module needs to be run

True
skip_static bool, default False

If true, binning will not calculate iv, woe etc. In this case, optimal-binning will not be supported.

False
Source code in federatedml/param/feature_binning_param.py
class FeatureBinningParam(BaseParam):
    """
    Define the feature binning method

    Parameters
    ----------
    method : str, 'quantile', 'bucket' or 'optimal', default: 'quantile'
        Binning method.

    compress_thres: int, default: 10000
        When the number of saved summaries exceed this threshold, it will call its compress function

    head_size: int, default: 10000
        The buffer size to store inserted observations. When head list reach this buffer size, the
        QuantileSummaries object start to generate summary(or stats) and insert into its sampled list.

    error: float, 0 <= error < 1, default: 0.0001
        The tolerated error of binning. The final split point comes from the original data, and the rank
        of this value is close to the exact rank. More precisely,
        floor((p - 2 * error) * N) <= rank(x) <= ceil((p + 2 * error) * N)
        where p is the quantile in float and N is the total number of data points.

    bin_num: int, bin_num > 0, default: 10
        The maximum number of bins for binning

    bin_indexes : list of int or int, default: -1
        Specify which columns need to be binned. -1 represents all columns. To indicate specific
        columns, provide a list of header indexes instead of -1.

    bin_names : list of string, default: []
        Specify which columns need to be binned, by name. Each element in the list represents a column name in the header.

    adjustment_factor : float, default: 0.5
        The adjustment factor when calculating WOE. This is useful when there is no event or non-event in
        a bin. Please note that this parameter will NOT take effect when set on the host.

    category_indexes : list of int or int, default: []
        Specify which columns are category features. -1 represents all columns. A list of int indicates a set of
        such features. For category features, bin_obj will take their original values as split_points and treat
        them as having been binned. If this is not what you expect, do NOT include them in this parameter.

        The number of categories should not exceed bin_num set above.

    category_names : list of string, default: []
        Use column names to specify category features. Each element in the list represents a column name in the header.

    local_only : bool, default: False
        Whether to provide the binning method to the guest party only. If True, the host party will do nothing.
        Warning: this parameter will be deprecated in a future version.

    transform_param: TransformParam
        Define how to transform the binned data.

    need_run: bool, default True
        Indicate whether this module needs to be run

    skip_static: bool, default False
        If true, binning will not calculate iv, woe etc. In this case, optimal-binning
        will not be supported.

    """

    def __init__(self, method=consts.QUANTILE,
                 compress_thres=consts.DEFAULT_COMPRESS_THRESHOLD,
                 head_size=consts.DEFAULT_HEAD_SIZE,
                 error=consts.DEFAULT_RELATIVE_ERROR,
                 bin_num=consts.G_BIN_NUM, bin_indexes=-1, bin_names=None, adjustment_factor=0.5,
                 transform_param=TransformParam(),
                 local_only=False,
                 category_indexes=None, category_names=None,
                 need_run=True, skip_static=False):
        super(FeatureBinningParam, self).__init__()
        self.method = method
        self.compress_thres = compress_thres
        self.head_size = head_size
        self.error = error
        self.adjustment_factor = adjustment_factor
        self.bin_num = bin_num
        self.bin_indexes = bin_indexes
        self.bin_names = bin_names
        self.category_indexes = category_indexes
        self.category_names = category_names
        self.transform_param = copy.deepcopy(transform_param)
        self.need_run = need_run
        self.skip_static = skip_static
        self.local_only = local_only

    def check(self):
        descr = "Binning param's"
        self.check_string(self.method, descr)
        self.method = self.method.lower()
        self.check_positive_integer(self.compress_thres, descr)
        self.check_positive_integer(self.head_size, descr)
        self.check_decimal_float(self.error, descr)
        self.check_positive_integer(self.bin_num, descr)
        if self.bin_indexes != -1:
            self.check_defined_type(self.bin_indexes, descr, ['list', 'RepeatedScalarContainer', "NoneType"])
        self.check_defined_type(self.bin_names, descr, ['list', "NoneType"])
        self.check_defined_type(self.category_indexes, descr, ['list', "NoneType"])
        self.check_defined_type(self.category_names, descr, ['list', "NoneType"])
        self.check_open_unit_interval(self.adjustment_factor, descr)
        self.check_boolean(self.local_only, descr)
__init__(self, method='quantile', compress_thres=10000, head_size=10000, error=0.0001, bin_num=10, bin_indexes=-1, bin_names=None, adjustment_factor=0.5, transform_param=TransformParam(), local_only=False, category_indexes=None, category_names=None, need_run=True, skip_static=False) special
Source code in federatedml/param/feature_binning_param.py
def __init__(self, method=consts.QUANTILE,
             compress_thres=consts.DEFAULT_COMPRESS_THRESHOLD,
             head_size=consts.DEFAULT_HEAD_SIZE,
             error=consts.DEFAULT_RELATIVE_ERROR,
             bin_num=consts.G_BIN_NUM, bin_indexes=-1, bin_names=None, adjustment_factor=0.5,
             transform_param=TransformParam(),
             local_only=False,
             category_indexes=None, category_names=None,
             need_run=True, skip_static=False):
    super(FeatureBinningParam, self).__init__()
    self.method = method
    self.compress_thres = compress_thres
    self.head_size = head_size
    self.error = error
    self.adjustment_factor = adjustment_factor
    self.bin_num = bin_num
    self.bin_indexes = bin_indexes
    self.bin_names = bin_names
    self.category_indexes = category_indexes
    self.category_names = category_names
    self.transform_param = copy.deepcopy(transform_param)
    self.need_run = need_run
    self.skip_static = skip_static
    self.local_only = local_only
check(self)
Source code in federatedml/param/feature_binning_param.py
def check(self):
    descr = "Binning param's"
    self.check_string(self.method, descr)
    self.method = self.method.lower()
    self.check_positive_integer(self.compress_thres, descr)
    self.check_positive_integer(self.head_size, descr)
    self.check_decimal_float(self.error, descr)
    self.check_positive_integer(self.bin_num, descr)
    if self.bin_indexes != -1:
        self.check_defined_type(self.bin_indexes, descr, ['list', 'RepeatedScalarContainer', "NoneType"])
    self.check_defined_type(self.bin_names, descr, ['list', "NoneType"])
    self.check_defined_type(self.category_indexes, descr, ['list', "NoneType"])
    self.check_defined_type(self.category_names, descr, ['list', "NoneType"])
    self.check_open_unit_interval(self.adjustment_factor, descr)
    self.check_boolean(self.local_only, descr)
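
The rank guarantee for the error parameter can be made concrete with a small worked sketch (illustrative numbers only):

from math import floor, ceil

N = 100_000        # hypothetical total number of data points
p = 0.5            # the quantile being queried (here, the median)
error = 0.0001     # the default relative error shown above

lower = floor((p - 2 * error) * N)
upper = ceil((p + 2 * error) * N)
print(lower, upper)  # 49980 50020: the returned split point's rank lies in this range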
HeteroFeatureBinningParam (FeatureBinningParam)
Source code in federatedml/param/feature_binning_param.py
class HeteroFeatureBinningParam(FeatureBinningParam):
    def __init__(self, method=consts.QUANTILE,
                 compress_thres=consts.DEFAULT_COMPRESS_THRESHOLD,
                 head_size=consts.DEFAULT_HEAD_SIZE,
                 error=consts.DEFAULT_RELATIVE_ERROR,
                 bin_num=consts.G_BIN_NUM, bin_indexes=-1, bin_names=None, adjustment_factor=0.5,
                 transform_param=TransformParam(), optimal_binning_param=OptimalBinningParam(),
                 local_only=False, category_indexes=None, category_names=None,
                 encrypt_param=EncryptParam(),
                 need_run=True, skip_static=False):
        super(HeteroFeatureBinningParam, self).__init__(method=method, compress_thres=compress_thres,
                                                        head_size=head_size, error=error,
                                                        bin_num=bin_num, bin_indexes=bin_indexes,
                                                        bin_names=bin_names, adjustment_factor=adjustment_factor,
                                                        transform_param=transform_param,
                                                        category_indexes=category_indexes,
                                                        category_names=category_names,
                                                        need_run=need_run, local_only=local_only,
                                                        skip_static=skip_static)
        self.optimal_binning_param = copy.deepcopy(optimal_binning_param)
        self.encrypt_param = encrypt_param

    def check(self):
        descr = "Hetero Binning param's"
        super(HeteroFeatureBinningParam, self).check()
        self.check_valid_value(self.method, descr, [consts.QUANTILE, consts.BUCKET, consts.OPTIMAL])
        self.optimal_binning_param.check()
        self.encrypt_param.check()
        if self.encrypt_param.method != consts.PAILLIER:
            raise ValueError("Feature Binning support Paillier encrypt method only.")
        if self.skip_static and self.method == consts.OPTIMAL:
            raise ValueError("When skip_static, optimal binning is not supported.")
        self.transform_param.check()
        if self.skip_static and self.transform_param.transform_type == 'woe':
            raise ValueError("To use woe transform, skip_static should set as False")
__init__(self, method='quantile', compress_thres=10000, head_size=10000, error=0.0001, bin_num=10, bin_indexes=-1, bin_names=None, adjustment_factor=0.5, transform_param=TransformParam(), optimal_binning_param=OptimalBinningParam(), local_only=False, category_indexes=None, category_names=None, encrypt_param=EncryptParam(), need_run=True, skip_static=False) special
Source code in federatedml/param/feature_binning_param.py
def __init__(self, method=consts.QUANTILE,
             compress_thres=consts.DEFAULT_COMPRESS_THRESHOLD,
             head_size=consts.DEFAULT_HEAD_SIZE,
             error=consts.DEFAULT_RELATIVE_ERROR,
             bin_num=consts.G_BIN_NUM, bin_indexes=-1, bin_names=None, adjustment_factor=0.5,
             transform_param=TransformParam(), optimal_binning_param=OptimalBinningParam(),
             local_only=False, category_indexes=None, category_names=None,
             encrypt_param=EncryptParam(),
             need_run=True, skip_static=False):
    super(HeteroFeatureBinningParam, self).__init__(method=method, compress_thres=compress_thres,
                                                    head_size=head_size, error=error,
                                                    bin_num=bin_num, bin_indexes=bin_indexes,
                                                    bin_names=bin_names, adjustment_factor=adjustment_factor,
                                                    transform_param=transform_param,
                                                    category_indexes=category_indexes,
                                                    category_names=category_names,
                                                    need_run=need_run, local_only=local_only,
                                                    skip_static=skip_static)
    self.optimal_binning_param = copy.deepcopy(optimal_binning_param)
    self.encrypt_param = encrypt_param
check(self)
Source code in federatedml/param/feature_binning_param.py
def check(self):
    descr = "Hetero Binning param's"
    super(HeteroFeatureBinningParam, self).check()
    self.check_valid_value(self.method, descr, [consts.QUANTILE, consts.BUCKET, consts.OPTIMAL])
    self.optimal_binning_param.check()
    self.encrypt_param.check()
    if self.encrypt_param.method != consts.PAILLIER:
        raise ValueError("Feature Binning support Paillier encrypt method only.")
    if self.skip_static and self.method == consts.OPTIMAL:
        raise ValueError("When skip_static, optimal binning is not supported.")
    self.transform_param.check()
    if self.skip_static and self.transform_param.transform_type == 'woe':
        raise ValueError("To use woe transform, skip_static should set as False")
HomoFeatureBinningParam (FeatureBinningParam)
Source code in federatedml/param/feature_binning_param.py
class HomoFeatureBinningParam(FeatureBinningParam):
    def __init__(self, method=consts.VIRTUAL_SUMMARY,
                 compress_thres=consts.DEFAULT_COMPRESS_THRESHOLD,
                 head_size=consts.DEFAULT_HEAD_SIZE,
                 error=consts.DEFAULT_RELATIVE_ERROR,
                 sample_bins=100,
                 bin_num=consts.G_BIN_NUM, bin_indexes=-1, bin_names=None, adjustment_factor=0.5,
                 transform_param=TransformParam(),
                 category_indexes=None, category_names=None,
                 need_run=True, skip_static=False, max_iter=100):
        super(HomoFeatureBinningParam, self).__init__(method=method, compress_thres=compress_thres,
                                                      head_size=head_size, error=error,
                                                      bin_num=bin_num, bin_indexes=bin_indexes,
                                                      bin_names=bin_names, adjustment_factor=adjustment_factor,
                                                      transform_param=transform_param,
                                                      category_indexes=category_indexes, category_names=category_names,
                                                      need_run=need_run,
                                                      skip_static=skip_static)
        self.sample_bins = sample_bins
        self.max_iter = max_iter

    def check(self):
        descr = "homo binning param's"
        super(HomoFeatureBinningParam, self).check()
        self.check_string(self.method, descr)
        self.method = self.method.lower()
        self.check_valid_value(self.method, descr, [consts.VIRTUAL_SUMMARY, consts.RECURSIVE_QUERY])
        self.check_positive_integer(self.max_iter, descr)
        if self.max_iter > 100:
            raise ValueError("Max iter is not allowed exceed 100")
__init__(self, method='virtual_summary', compress_thres=10000, head_size=10000, error=0.0001, sample_bins=100, bin_num=10, bin_indexes=-1, bin_names=None, adjustment_factor=0.5, transform_param=TransformParam(), category_indexes=None, category_names=None, need_run=True, skip_static=False, max_iter=100) special
Source code in federatedml/param/feature_binning_param.py
def __init__(self, method=consts.VIRTUAL_SUMMARY,
             compress_thres=consts.DEFAULT_COMPRESS_THRESHOLD,
             head_size=consts.DEFAULT_HEAD_SIZE,
             error=consts.DEFAULT_RELATIVE_ERROR,
             sample_bins=100,
             bin_num=consts.G_BIN_NUM, bin_indexes=-1, bin_names=None, adjustment_factor=0.5,
             transform_param=TransformParam(),
             category_indexes=None, category_names=None,
             need_run=True, skip_static=False, max_iter=100):
    super(HomoFeatureBinningParam, self).__init__(method=method, compress_thres=compress_thres,
                                                  head_size=head_size, error=error,
                                                  bin_num=bin_num, bin_indexes=bin_indexes,
                                                  bin_names=bin_names, adjustment_factor=adjustment_factor,
                                                  transform_param=transform_param,
                                                  category_indexes=category_indexes, category_names=category_names,
                                                  need_run=need_run,
                                                  skip_static=skip_static)
    self.sample_bins = sample_bins
    self.max_iter = max_iter
check(self)
Source code in federatedml/param/feature_binning_param.py
def check(self):
    descr = "homo binning param's"
    super(HomoFeatureBinningParam, self).check()
    self.check_string(self.method, descr)
    self.method = self.method.lower()
    self.check_valid_value(self.method, descr, [consts.VIRTUAL_SUMMARY, consts.RECURSIVE_QUERY])
    self.check_positive_integer(self.max_iter, descr)
    if self.max_iter > 100:
        raise ValueError("Max iter is not allowed exceed 100")
feature_imputation_param
Classes
FeatureImputationParam (BaseParam)

Define feature imputation parameters

Parameters:

Name Type Description Default
default_value None or single object type or list

The value used to replace missing values. If None, the default value defined in federatedml/feature/imputer.py is used; if a single object, missing values are filled with this object; if a list, its length should equal the input data's feature dimension, and a missing value in some column is replaced by the element at the identical position in this list.

0
missing_fill_method [None, 'min', 'max', 'mean', 'designated']

The method used to replace missing values

None
col_missing_fill_method None or dict of (column name, missing_fill_method) pairs

Specifies the method used to replace missing values for each column; any column not specified takes missing_fill_method. If missing_fill_method is None, unspecified columns are not imputed.

None
missing_impute None or list

Defines which values are considered missing; elements may be of any type. If None, the set is auto-generated. Default: None

None
need_run bool, default True

Whether to run this module

True
Source code in federatedml/param/feature_imputation_param.py
class FeatureImputationParam(BaseParam):
    """
    Define feature imputation parameters

    Parameters
    ----------

    default_value : None or single object type or list
        The value used to replace missing values.
        If None, the default value defined in federatedml/feature/imputer.py is used;
        if a single object, missing values are filled with this object;
        if a list, its length should equal the input data's feature dimension,
            and a missing value in some column is replaced by the element
            at the identical position in this list.

    missing_fill_method : [None, 'min', 'max', 'mean', 'designated']
        the method used to replace missing values

    col_missing_fill_method: None or dict of (column name, missing_fill_method) pairs
        specifies the method used to replace missing values for each column;
        any column not specified takes missing_fill_method;
        if missing_fill_method is None, unspecified columns are not imputed

    missing_impute : None or list
        defines which values are considered missing; elements may be of any type; if None, the set is auto-generated. default: None

    need_run: bool, default True
        whether to run this module

    """

    def __init__(self, default_value=0, missing_fill_method=None, col_missing_fill_method=None,
                 missing_impute=None, need_run=True):
        self.default_value = default_value
        self.missing_fill_method = missing_fill_method
        self.col_missing_fill_method = col_missing_fill_method
        self.missing_impute = missing_impute
        self.need_run = need_run

    def check(self):

        descr = "feature imputation param's "

        self.check_boolean(self.need_run, descr+"need_run")

        if self.missing_fill_method is not None:
            self.missing_fill_method = self.check_and_change_lower(self.missing_fill_method,
                                                                   ['min', 'max', 'mean', 'designated'],
                                                                   f"{descr}missing_fill_method ")
        if self.col_missing_fill_method:
            if not isinstance(self.col_missing_fill_method, dict):
                raise ValueError(f"{descr}col_missing_fill_method should be a dict")
            for k, v in self.col_missing_fill_method.items():
                if not isinstance(k, str):
                    raise ValueError(f"{descr}col_missing_fill_method should contain str key(s) only")
                v = self.check_and_change_lower(v,
                                                ['min', 'max', 'mean', 'designated'],
                                                f"per column method specified in {descr} col_missing_fill_method dict")
                self.col_missing_fill_method[k] = v
        if self.missing_impute:
            if not isinstance(self.missing_impute, list):
                raise ValueError(f"{descr}missing_impute must be None or list.")

        return True
__init__(self, default_value=0, missing_fill_method=None, col_missing_fill_method=None, missing_impute=None, need_run=True) special
Source code in federatedml/param/feature_imputation_param.py
def __init__(self, default_value=0, missing_fill_method=None, col_missing_fill_method=None,
             missing_impute=None, need_run=True):
    self.default_value = default_value
    self.missing_fill_method = missing_fill_method
    self.col_missing_fill_method = col_missing_fill_method
    self.missing_impute = missing_impute
    self.need_run = need_run
check(self)
Source code in federatedml/param/feature_imputation_param.py
def check(self):

    descr = "feature imputation param's "

    self.check_boolean(self.need_run, descr+"need_run")

    if self.missing_fill_method is not None:
        self.missing_fill_method = self.check_and_change_lower(self.missing_fill_method,
                                                               ['min', 'max', 'mean', 'designated'],
                                                               f"{descr}missing_fill_method ")
    if self.col_missing_fill_method:
        if not isinstance(self.col_missing_fill_method, dict):
            raise ValueError(f"{descr}col_missing_fill_method should be a dict")
        for k, v in self.col_missing_fill_method.items():
            if not isinstance(k, str):
                raise ValueError(f"{descr}col_missing_fill_method should contain str key(s) only")
            v = self.check_and_change_lower(v,
                                            ['min', 'max', 'mean', 'designated'],
                                            f"per column method specified in {descr} col_missing_fill_method dict")
            self.col_missing_fill_method[k] = v
    if self.missing_impute:
        if not isinstance(self.missing_impute, list):
            raise ValueError(f"{descr}missing_impute must be None or list.")

    return True
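
A sketch showing per-column overrides; check() lower-cases the per-column methods via check_and_change_lower (the column names here are hypothetical):

from federatedml.param.feature_imputation_param import FeatureImputationParam

param = FeatureImputationParam(
    missing_fill_method="designated",        # fallback for columns not listed below
    default_value=0,                         # the designated fill value
    col_missing_fill_method={"x1": "mean",   # per-column overrides
                             "x2": "MAX"},   # lower-cased to 'max' by check()
    missing_impute=[None, "", "NA"])         # values treated as missing
param.check()
print(param.col_missing_fill_method)         # {'x1': 'mean', 'x2': 'max'}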
feature_selection_param
deprecated_param_list
Classes
UniqueValueParam (BaseParam)

Use the difference between max-value and min-value to judge.

Parameters:

Name Type Description Default
eps float, default: 1e-5

The column(s) will be filtered if its difference is smaller than eps.

1e-05
Source code in federatedml/param/feature_selection_param.py
class UniqueValueParam(BaseParam):
    """
    Use the difference between max-value and min-value to judge.

    Parameters
    ----------
    eps : float, default: 1e-5
        The column(s) will be filtered if its difference is smaller than eps.
    """

    def __init__(self, eps=1e-5):
        self.eps = eps

    def check(self):
        descr = "Unique value param's"
        self.check_positive_number(self.eps, descr)
        return True
__init__(self, eps=1e-05) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, eps=1e-5):
    self.eps = eps
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "Unique value param's"
    self.check_positive_number(self.eps, descr)
    return True
IVValueSelectionParam (BaseParam)

Use information values to select features.

Parameters:

Name Type Description Default
value_threshold float, default: 0.0

Used if iv_value_thres method is used in feature selection.

0.0
host_thresholds List of float or None, default: None

Set threshold for different host. If None, use same threshold as guest. If provided, the order should map with the host id setting.

None
Source code in federatedml/param/feature_selection_param.py
class IVValueSelectionParam(BaseParam):
    """
    Use information values to select features.

    Parameters
    ----------
    value_threshold: float, default: 0.0
        Used if iv_value_thres method is used in feature selection.

    host_thresholds: List of float or None, default: None
        Set threshold for different host. If None, use same threshold as guest. If provided, the order should map with
        the host id setting.

    """

    def __init__(self, value_threshold=0.0, host_thresholds=None, local_only=False):
        super().__init__()
        self.value_threshold = value_threshold
        self.host_thresholds = host_thresholds
        self.local_only = local_only

    def check(self):
        if not isinstance(self.value_threshold, (float, int)):
            raise ValueError("IV selection param's value_threshold should be float or int")

        if self.host_thresholds is not None:
            if not isinstance(self.host_thresholds, list):
                raise ValueError("IV selection param's host_threshold should be list or None")

        if not isinstance(self.local_only, bool):
            raise ValueError("IV selection param's local_only should be bool")

        return True
__init__(self, value_threshold=0.0, host_thresholds=None, local_only=False) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, value_threshold=0.0, host_thresholds=None, local_only=False):
    super().__init__()
    self.value_threshold = value_threshold
    self.host_thresholds = host_thresholds
    self.local_only = local_only
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    if not isinstance(self.value_threshold, (float, int)):
        raise ValueError("IV selection param's value_threshold should be float or int")

    if self.host_thresholds is not None:
        if not isinstance(self.host_thresholds, list):
            raise ValueError("IV selection param's host_threshold should be list or None")

    if not isinstance(self.local_only, bool):
        raise ValueError("IV selection param's local_only should be bool")

    return True
IVPercentileSelectionParam (BaseParam)

Use information values to select features.

Parameters:

Name Type Description Default
percentile_threshold float

0 <= percentile_threshold <= 1.0, default: 1.0, Percentile threshold for iv_percentile method

1.0
Source code in federatedml/param/feature_selection_param.py
class IVPercentileSelectionParam(BaseParam):
    """
    Use information values to select features.

    Parameters
    ----------
    percentile_threshold: float
        0 <= percentile_threshold <= 1.0, default: 1.0, Percentile threshold for iv_percentile method
    """

    def __init__(self, percentile_threshold=1.0, local_only=False):
        super().__init__()
        self.percentile_threshold = percentile_threshold
        self.local_only = local_only

    def check(self):
        descr = "IV selection param's"
        if self.percentile_threshold != 0 and self.percentile_threshold != 1:
            self.check_decimal_float(self.percentile_threshold, descr)
        self.check_boolean(self.local_only, descr)
        return True
__init__(self, percentile_threshold=1.0, local_only=False) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, percentile_threshold=1.0, local_only=False):
    super().__init__()
    self.percentile_threshold = percentile_threshold
    self.local_only = local_only
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "IV selection param's"
    if self.percentile_threshold != 0 and self.percentile_threshold != 1:
        self.check_decimal_float(self.percentile_threshold, descr)
    self.check_boolean(self.local_only, descr)
    return True
IVTopKParam (BaseParam)

Use information values to select features.

Parameters:

Name Type Description Default
k int

should be greater than 0, default: 10. The k value for the top-k iv filter

10
Source code in federatedml/param/feature_selection_param.py
class IVTopKParam(BaseParam):
    """
    Use information values to select features.

    Parameters
    ----------
    k: int
        should be greater than 0, default: 10. The k value for the top-k iv filter
    """

    def __init__(self, k=10, local_only=False):
        super().__init__()
        self.k = k
        self.local_only = local_only

    def check(self):
        descr = "IV selection param's"
        self.check_positive_integer(self.k, descr)
        self.check_boolean(self.local_only, descr)
        return True
__init__(self, k=10, local_only=False) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, k=10, local_only=False):
    super().__init__()
    self.k = k
    self.local_only = local_only
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "IV selection param's"
    self.check_positive_integer(self.k, descr)
    self.check_boolean(self.local_only, descr)
    return True
VarianceOfCoeSelectionParam (BaseParam)

Use coefficient of variation to select features. When judging, the absolute value will be used.

Parameters:

Name Type Description Default
value_threshold float, default: 1.0

Used if the coefficient_of_variation_value_thres method is used in feature selection. Filter those columns whose coefficient of variation is smaller than the threshold.

1.0
Source code in federatedml/param/feature_selection_param.py
class VarianceOfCoeSelectionParam(BaseParam):
    """
    Use coefficient of variation to select features. When judging, the absolute value will be used.

    Parameters
    ----------
    value_threshold: float, default: 1.0
        Used if the coefficient_of_variation_value_thres method is used in feature selection. Filter those
        columns whose coefficient of variation is smaller than the threshold.

    """

    def __init__(self, value_threshold=1.0):
        self.value_threshold = value_threshold

    def check(self):
        descr = "Coff of Variances param's"
        self.check_positive_number(self.value_threshold, descr)
        return True
__init__(self, value_threshold=1.0) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, value_threshold=1.0):
    self.value_threshold = value_threshold
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "Coff of Variances param's"
    self.check_positive_number(self.value_threshold, descr)
    return True
OutlierColsSelectionParam (BaseParam)

Given a percentile and a threshold, judge whether the value at this quantile point is larger than the threshold, and filter out the columns for which it is.

Parameters:

Name Type Description Default
percentile float, [0., 1.] default: 1.0

The percentile points to compare.

1.0
upper_threshold float, default: 1.0

Percentile threshold for coefficient_of_variation_percentile method

1.0
Source code in federatedml/param/feature_selection_param.py
class OutlierColsSelectionParam(BaseParam):
    """
    Given a percentile and a threshold, judge whether the value at this quantile point is larger than the threshold, and filter out the columns for which it is.

    Parameters
    ----------
    percentile: float, [0., 1.] default: 1.0
        The percentile points to compare.

    upper_threshold: float, default: 1.0
        Percentile threshold for coefficient_of_variation_percentile method

    """

    def __init__(self, percentile=1.0, upper_threshold=1.0):
        self.percentile = percentile
        self.upper_threshold = upper_threshold

    def check(self):
        descr = "Outlier Filter param's"
        self.check_decimal_float(self.percentile, descr)
        self.check_defined_type(self.upper_threshold, descr, ['float', 'int'])
        return True
__init__(self, percentile=1.0, upper_threshold=1.0) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, percentile=1.0, upper_threshold=1.0):
    self.percentile = percentile
    self.upper_threshold = upper_threshold
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "Outlier Filter param's"
    self.check_decimal_float(self.percentile, descr)
    self.check_defined_type(self.upper_threshold, descr, ['float', 'int'])
    return True
CommonFilterParam (BaseParam)

All of the following parameters can be set with a single value or a list of values.

Setting a single value means using only one metric to filter, while a list represents using multiple metrics.

Please note that if some of the following values are set as lists, all of them should have the same length; otherwise, an error will be raised. And if any parameter is given as a list, metrics should also be a list.

Parameters:

Name Type Description Default
metrics str or list, default: depends on the specific filter

Indicate what metrics are used in this filter

required
filter_type str, default: threshold

Should be one of "threshold", "top_k" or "top_percentile"

'threshold'
take_high bool, default: True

When filtering, taking highest values or not.

True
threshold float or int, default: 1

If filter type is threshold, this is the threshold value. If it is "top_k", this is the k value. If it is top_percentile, this is the percentile threshold.

1
host_thresholds List of float or List of List of float or None, default: None

Set threshold for different host. If None, use same threshold as guest. If provided, the order should map with the host id setting.

None
select_federated bool, default: True

Whether to perform federated selection with other parties or to select based on local variables only

True
Source code in federatedml/param/feature_selection_param.py
class CommonFilterParam(BaseParam):
    """
    All of the following parameters can be set with a single value or a list of values.
    Setting a single value means using only one metric to filter, while
    a list represents using multiple metrics.

    Please note that if some of the following values are set as lists, all of them
    should have the same length; otherwise, an error will be raised. And if any parameter
    is given as a list, the metrics should also be in list form.

    Parameters
    ----------
    metrics: str or list, default: depends on the specific filter
        Indicate what metrics are used in this filter

    filter_type: str, default: threshold
        Should be one of "threshold", "top_k" or "top_percentile"

    take_high: bool, default: True
        When filtering, taking highest values or not.

    threshold: float or int, default: 1
        If filter type is threshold, this is the threshold value.
        If it is "top_k", this is the k value.
        If it is top_percentile, this is the percentile threshold.

    host_thresholds: List of float or List of List of float or None, default: None
        Set threshold for different host. If None, use same threshold as guest. If provided, the order should map with
        the host id setting.

    select_federated: bool, default: True
        Whether to perform federated selection with other parties or to select based on local variables only
    """

    def __init__(self, metrics, filter_type='threshold', take_high=True, threshold=1,
                 host_thresholds=None, select_federated=True):
        super().__init__()
        self.metrics = metrics
        self.filter_type = filter_type
        self.take_high = take_high
        self.threshold = threshold
        self.host_thresholds = host_thresholds
        self.select_federated = select_federated

    def check(self):
        self._convert_to_list(param_names=["filter_type", "take_high",
                                           "threshold", "select_federated"])

        for v in self.filter_type:
            if v not in ["threshold", "top_k", "top_percentile"]:
                raise ValueError('filter_type should be one of '
                                 '"threshold", "top_k", "top_percentile"')

        descr = "hetero feature selection param's"
        for v in self.take_high:
            self.check_boolean(v, descr)

        for idx, v in enumerate(self.threshold):
            if self.filter_type[idx] == "threshold":
                if not isinstance(v, (float, int)):
                    raise ValueError(descr + f"{v} should be a float or int")
            elif self.filter_type[idx] == 'top_k':
                self.check_positive_integer(v, descr)
            else:
                if not (v == 0 or v == 1):
                    self.check_decimal_float(v, descr)

        if self.host_thresholds is not None:
            if not isinstance(self.host_thresholds, list):
                raise ValueError("IV selection param's host_threshold should be list or None")

        assert isinstance(self.select_federated, list)
        for v in self.select_federated:
            self.check_boolean(v, descr)

    def _convert_to_list(self, param_names):
        if not isinstance(self.metrics, list):
            for value_name in param_names:
                v = getattr(self, value_name)
                if isinstance(v, list):
                    raise ValueError(f"{value_name}: {v} should not be a list when "
                                     f"metrics: {self.metrics} is not a list")
                setattr(self, value_name, [v])
            setattr(self, "metrics", [self.metrics])
        else:
            expected_length = len(self.metrics)
            for value_name in param_names:
                v = getattr(self, value_name)
                if isinstance(v, list):
                    if len(v) != expected_length:
                        raise ValueError(f"The parameter {v} should have same length "
                                         f"with metrics")
                else:
                    new_v = [v] * expected_length
                    setattr(self, value_name, new_v)
__init__(self, metrics, filter_type='threshold', take_high=True, threshold=1, host_thresholds=None, select_federated=True) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, metrics, filter_type='threshold', take_high=True, threshold=1,
             host_thresholds=None, select_federated=True):
    super().__init__()
    self.metrics = metrics
    self.filter_type = filter_type
    self.take_high = take_high
    self.threshold = threshold
    self.host_thresholds = host_thresholds
    self.select_federated = select_federated
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    self._convert_to_list(param_names=["filter_type", "take_high",
                                       "threshold", "select_federated"])

    for v in self.filter_type:
        if v not in ["threshold", "top_k", "top_percentile"]:
            raise ValueError('filter_type should be one of '
                             '"threshold", "top_k", "top_percentile"')

    descr = "hetero feature selection param's"
    for v in self.take_high:
        self.check_boolean(v, descr)

    for idx, v in enumerate(self.threshold):
        if self.filter_type[idx] == "threshold":
            if not isinstance(v, (float, int)):
                raise ValueError(descr + f"{v} should be a float or int")
        elif self.filter_type[idx] == 'top_k':
            self.check_positive_integer(v, descr)
        else:
            if not (v == 0 or v == 1):
                self.check_decimal_float(v, descr)

    if self.host_thresholds is not None:
        if not isinstance(self.host_thresholds, list):
            raise ValueError("IV selection param's host_threshold should be list or None")

    assert isinstance(self.select_federated, list)
    for v in self.select_federated:
        self.check_boolean(v, descr)
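
The scalar-to-list conversion described above can be seen directly (a sketch assuming the import path shown; the metric names are illustrative):

from federatedml.param.feature_selection_param import CommonFilterParam

# A single metric: scalar settings are wrapped into one-element lists by check().
single = CommonFilterParam(metrics="iv", filter_type="top_k", threshold=20)
single.check()
print(single.metrics, single.filter_type, single.threshold)
# ['iv'] ['top_k'] [20]

# Multiple metrics: scalars are broadcast to len(metrics); any list supplied
# here must already have the same length, otherwise check() raises ValueError.
multi = CommonFilterParam(metrics=["iv", "statistic"],
                          filter_type=["top_k", "threshold"],
                          threshold=[20, 0.5])
multi.check()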
IVFilterParam (CommonFilterParam)

Parameters:

Name Type Description Default
mul_class_merge_type str or list, default: "average"

Indicate how to merge multi-class iv results. Support "average", "min" and "max".

'average'
Source code in federatedml/param/feature_selection_param.py
class IVFilterParam(CommonFilterParam):
    """
    Parameters
    ----------
    mul_class_merge_type: str or list, default: "average"
        Indicate how to merge multi-class iv results. Support "average", "min" and "max".

    """

    def __init__(self, filter_type='threshold', threshold=1,
                 host_thresholds=None, select_federated=True, mul_class_merge_type="average"):
        super().__init__(metrics='iv', filter_type=filter_type, take_high=True, threshold=threshold,
                         host_thresholds=host_thresholds, select_federated=select_federated)
        self.mul_class_merge_type = mul_class_merge_type

    def check(self):
        super(IVFilterParam, self).check()
        self._convert_to_list(param_names=["mul_class_merge_type"])
__init__(self, filter_type='threshold', threshold=1, host_thresholds=None, select_federated=True, mul_class_merge_type='average') special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, filter_type='threshold', threshold=1,
             host_thresholds=None, select_federated=True, mul_class_merge_type="average"):
    super().__init__(metrics='iv', filter_type=filter_type, take_high=True, threshold=threshold,
                     host_thresholds=host_thresholds, select_federated=select_federated)
    self.mul_class_merge_type = mul_class_merge_type
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    super(IVFilterParam, self).check()
    self._convert_to_list(param_names=["mul_class_merge_type"])
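
A sketch for the iv filter; after check(), mul_class_merge_type is list-converted alongside the inherited parameters:

from federatedml.param.feature_selection_param import IVFilterParam

# Keep the 20 highest-iv features; merge multi-class iv results by their minimum.
param = IVFilterParam(filter_type="top_k",
                      threshold=20,
                      mul_class_merge_type="min")
param.check()
print(param.metrics, param.mul_class_merge_type)  # ['iv'] ['min']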
CorrelationFilterParam (BaseParam)

This filter follows these specific rules:

  1. Sort all the columns from high to low based on a specific metric, e.g. iv.
  2. Traverse each sorted column. If there exist other columns whose absolute correlation with it is larger than the threshold, those columns will be filtered.

Parameters:

Name Type Description Default
sort_metric str, default: iv

Specify which metric to be used to sort features.

'iv'
threshold float or int, default: 0.1

Correlation threshold

0.1
select_federated bool, default: True

Whether to perform federated selection with other parties or to select based on local variables only

True
Source code in federatedml/param/feature_selection_param.py
class CorrelationFilterParam(BaseParam):
    """
    This filter follows these specific rules:
        1. Sort all the columns from high to low based on a specific metric, e.g. iv.
        2. Traverse each sorted column. If there exist other columns whose absolute
            correlation with it is larger than the threshold, those columns will be filtered.

    Parameters
    ----------
    sort_metric: str, default: iv
        Specify which metric to be used to sort features.

    threshold: float or int, default: 0.1
        Correlation threshold

    select_federated: bool, default: True
        Whether to perform federated selection with other parties or to select based on local variables only
    """

    def __init__(self, sort_metric='iv', threshold=0.1, select_federated=True):
        super().__init__()
        self.sort_metric = sort_metric
        self.threshold = threshold
        self.select_federated = select_federated

    def check(self):
        descr = "Correlation Filter param's"

        self.sort_metric = self.sort_metric.lower()
        support_metrics = ['iv']
        if self.sort_metric not in support_metrics:
            raise ValueError(f"sort_metric in Correlation Filter should be one of {support_metrics}")

        self.check_positive_number(self.threshold, descr)
__init__(self, sort_metric='iv', threshold=0.1, select_federated=True) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, sort_metric='iv', threshold=0.1, select_federated=True):
    super().__init__()
    self.sort_metric = sort_metric
    self.threshold = threshold
    self.select_federated = select_federated
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "Correlation Filter param's"

    self.sort_metric = self.sort_metric.lower()
    support_metrics = ['iv']
    if self.sort_metric not in support_metrics:
        raise ValueError(f"sort_metric in Correlation Filter should be one of {support_metrics}")

    self.check_positive_number(self.threshold, descr)
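
The two rules above amount to a greedy pass over the iv-sorted columns. A plain-Python illustration (not FATE's internal implementation; the iv and correlation values are made up):

iv = {"x1": 0.8, "x2": 0.6, "x3": 0.4}
corr = {("x1", "x2"): 0.95, ("x1", "x3"): 0.05, ("x2", "x3"): 0.1}

def correlated(a, b, threshold=0.1):
    c = corr.get((a, b), corr.get((b, a), 0.0))
    return abs(c) > threshold

kept = []
for col in sorted(iv, key=iv.get, reverse=True):  # high iv to low
    if any(correlated(col, k) for k in kept):
        continue  # filtered: too correlated with a higher-iv column already kept
    kept.append(col)
print(kept)  # ['x1', 'x3']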
PercentageValueParam (BaseParam)

Filter the columns in which a single value's proportion exceeds a certain percentage.

Parameters:

Name Type Description Default
upper_pct float, [0.1, 1.], default: 1.0

The upper percentage threshold for filtering, upper_pct should not be less than 0.1.

1.0
Source code in federatedml/param/feature_selection_param.py
class PercentageValueParam(BaseParam):
    """
    Filter the columns in which a single value's proportion exceeds a certain percentage.

    Parameters
    ----------
    upper_pct: float, [0.1, 1.], default: 1.0
        The upper percentage threshold for filtering, upper_pct should not be less than 0.1.

    """

    def __init__(self, upper_pct=1.0):
        super().__init__()
        self.upper_pct = upper_pct

    def check(self):
        descr = "Percentage Filter param's"
        if self.upper_pct not in [0, 1]:
            self.check_decimal_float(self.upper_pct, descr)
        if self.upper_pct < consts.PERCENTAGE_VALUE_LIMIT:
            raise ValueError(descr + f" {self.upper_pct} not supported,"
                                     f" should not be smaller than {consts.PERCENTAGE_VALUE_LIMIT}")
        return True
__init__(self, upper_pct=1.0) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, upper_pct=1.0):
    super().__init__()
    self.upper_pct = upper_pct
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "Percentage Filter param's"
    if self.upper_pct not in [0, 1]:
        self.check_decimal_float(self.upper_pct, descr)
    if self.upper_pct < consts.PERCENTAGE_VALUE_LIMIT:
        raise ValueError(descr + f" {self.upper_pct} not supported,"
                                 f" should not be smaller than {consts.PERCENTAGE_VALUE_LIMIT}")
    return True
ManuallyFilterParam (BaseParam)

Specify columns that need to be filtered. If a specified column exists, it is filtered directly; otherwise, it is ignored.

Both the filter_out and left parameters work for this specific filter only. For instance, if you set some columns as left in this filter but those columns are filtered by other filters, they will NOT be left in the final result.

Please note that (left_col_indexes & left_col_names) cannot be used together with (filter_out_indexes & filter_out_names).

Parameters:

Name Type Description Default
filter_out_indexes list of int, default: None

Specify columns' indexes to be filtered out

None
filter_out_names list of string, default: None

Specify columns' names to be filtered out

None
left_col_indexes list of int, default: None

Specify left_col_index

None
left_col_names list of string, default: None

Specify left col names

None
Source code in federatedml/param/feature_selection_param.py
class ManuallyFilterParam(BaseParam):
    """
    Specify columns that need to be filtered. If a specified column exists, it is filtered directly; otherwise, it is ignored.

    Both the filter_out and left parameters work for this specific filter only. For instance, if you set some columns
    as left in this filter but those columns are filtered by other filters, they will NOT be left in the final result.

    Please note that (left_col_indexes & left_col_names) cannot be used together with (filter_out_indexes & filter_out_names).

    Parameters
    ----------
    filter_out_indexes: list of int, default: None
        Specify columns' indexes to be filtered out

    filter_out_names : list of string, default: None
        Specify columns' names to be filtered out

    left_col_indexes: list of int, default: None
        Specify columns' indexes to be kept

    left_col_names: list of string, default: None
        Specify columns' names to be kept


    """

    def __init__(self, filter_out_indexes=None, filter_out_names=None, left_col_indexes=None,
                 left_col_names=None):
        super().__init__()
        self.filter_out_indexes = filter_out_indexes
        self.filter_out_names = filter_out_names
        self.left_col_indexes = left_col_indexes
        self.left_col_names = left_col_names

    def check(self):
        descr = "Manually Filter param's"
        self.check_defined_type(self.filter_out_indexes, descr, ['list', 'NoneType'])
        self.check_defined_type(self.filter_out_names, descr, ['list', 'NoneType'])
        self.check_defined_type(self.left_col_indexes, descr, ['list', 'NoneType'])
        self.check_defined_type(self.left_col_names, descr, ['list', 'NoneType'])

        if (self.filter_out_indexes or self.filter_out_names) is not None and \
                (self.left_col_names or self.left_col_indexes) is not None:
            raise ValueError("(left_col_indexes & left_col_names) cannot use with"
                             " (filter_out_indexes & filter_out_names) simultaneously")
        return True
__init__(self, filter_out_indexes=None, filter_out_names=None, left_col_indexes=None, left_col_names=None) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, filter_out_indexes=None, filter_out_names=None, left_col_indexes=None,
             left_col_names=None):
    super().__init__()
    self.filter_out_indexes = filter_out_indexes
    self.filter_out_names = filter_out_names
    self.left_col_indexes = left_col_indexes
    self.left_col_names = left_col_names
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "Manually Filter param's"
    self.check_defined_type(self.filter_out_indexes, descr, ['list', 'NoneType'])
    self.check_defined_type(self.filter_out_names, descr, ['list', 'NoneType'])
    self.check_defined_type(self.left_col_indexes, descr, ['list', 'NoneType'])
    self.check_defined_type(self.left_col_names, descr, ['list', 'NoneType'])

    if (self.filter_out_indexes or self.filter_out_names) is not None and \
            (self.left_col_names or self.left_col_indexes) is not None:
        raise ValueError("(left_col_indexes & left_col_names) cannot use with"
                         " (filter_out_indexes & filter_out_names) simultaneously")
    return True
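A minimal sketch of the mutual-exclusion rule enforced by check(), assuming the import path shown in the source listing above:

from federatedml.param.feature_selection_param import ManuallyFilterParam

# Valid: only the filter_out_* group is set.
ManuallyFilterParam(filter_out_indexes=[0, 2], filter_out_names=["x3"]).check()

# Invalid: mixing filter_out_* with left_col_* raises ValueError.
try:
    ManuallyFilterParam(filter_out_indexes=[0], left_col_names=["x1"]).check()
except ValueError as err:
    print(err)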
FeatureSelectionParam (BaseParam)

Define the feature selection parameters.

Parameters:

Name Type Description Default
select_col_indexes list or int, default: -1

Specify which columns need to be calculated. -1 represents all columns.

-1
select_names list of string, default: []

Specify which columns need to be calculated. Each element in the list represents a column name in the header.

None
filter_methods list of ["manually", "iv_filter", "statistic_filter", "psi_filter", "hetero_sbt_filter", "homo_sbt_filter", "hetero_fast_sbt_filter", "percentage_value", "vif_filter", "correlation_filter"], default: ["manually"]

The following methods will be deprecated in a future version: "unique_value", "iv_value_thres", "iv_percentile", "coefficient_of_variation_value_thres", "outlier_cols"

Specify the filter methods used in feature selection. Filters are applied in the order given by this list. Note that if a percentile method is used after another filter method, the percentile refers to the ratio of the remaining features.

e.g. if you start with 10 features and the first filter method leaves 8, then requesting the top 80% highest-iv features selects floor(0.8 * 8) = 6 features, not 8.

None
unique_param UniqueValueParam

Filter a column if all of its values are the same.

UniqueValueParam()
iv_value_param IVValueSelectionParam

Use information value to filter columns. If this method is set, a float threshold needs to be provided. Columns whose iv is smaller than the threshold are filtered out. Will be deprecated in the future.

IVValueSelectionParam()
iv_percentile_param IVPercentileSelectionParam

Use information value to filter columns. If this method is set, a float ratio threshold needs to be provided. The floor(ratio * feature_num) features with the highest iv are picked. If multiple features around the threshold share the same iv, all of those columns are kept. Will be deprecated in the future.

IVPercentileSelectionParam()
variance_coe_param VarianceOfCoeSelectionParam

Use the coefficient of variation to judge whether a column is filtered. Will be deprecated in the future.

VarianceOfCoeSelectionParam()
outlier_param OutlierColsSelectionParam

Filter columns whose certain percentile value is larger than a threshold. Will be deprecated in the future.

<federatedml.param.feature_selection_param.OutlierColsSelectionParam object at 0x7f2755330590>
percentage_value_param PercentageValueParam

Filter the columns in which a single value exceeds a certain percentage of rows.

PercentageValueParam()
iv_param IVFilterParam

Sets how to filter based on iv. Only the take-high mode is supported. All of "threshold", "top_k" and "top_percentile" are accepted. Check more details in CommonFilterParam. To use this filter, the hetero-feature-binning module has to be provided.

IVFilterParam()
statistic_param CommonFilterParam

Sets how to filter based on statistic values. All of "threshold", "top_k" and "top_percentile" are accepted. Check more details in CommonFilterParam. To use this filter, the data_statistic module has to be provided.

CommonFilterParam(metrics=consts.MEAN)
psi_param CommonFilterParam

Sets how to filter based on psi values. All of "threshold", "top_k" and "top_percentile" are accepted. Its take_high property should be False so that lower-psi features are chosen. Check more details in CommonFilterParam. To use this filter, the data_statistic module has to be provided.

CommonFilterParam(metrics=consts.PSI, take_high=False)
need_run bool, default True

Indicates whether this module needs to be run

True
Source code in federatedml/param/feature_selection_param.py
class FeatureSelectionParam(BaseParam):
    """
    Define the feature selection parameters.

    Parameters
    ----------
    select_col_indexes: list or int, default: -1
        Specify which columns need to be calculated. -1 represents all columns.

    select_names : list of string, default: []
        Specify which columns need to be calculated. Each element in the list represents a column name in the header.

    filter_methods: list of ["manually", "iv_filter", "statistic_filter", "psi_filter", "hetero_sbt_filter", "homo_sbt_filter", "hetero_fast_sbt_filter", "percentage_value", "vif_filter", "correlation_filter"], default: ["manually"]
        The following methods will be deprecated in a future version:
        "unique_value", "iv_value_thres", "iv_percentile",
        "coefficient_of_variation_value_thres", "outlier_cols"

        Specify the filter methods used in feature selection. Filters are applied in the order given by this list.
        Note that if a percentile method is used after another filter method,
        the percentile refers to the ratio of the remaining features.

        e.g. if you start with 10 features and the first filter method leaves 8, then requesting the
        top 80% highest-iv features selects floor(0.8 * 8) = 6 features, not 8.

    unique_param: UniqueValueParam
        Filter a column if all of its values are the same.

    iv_value_param: IVValueSelectionParam
        Use information value to filter columns. If this method is set, a float threshold needs to be provided.
        Columns whose iv is smaller than the threshold are filtered out. Will be deprecated in the future.

    iv_percentile_param: IVPercentileSelectionParam
        Use information value to filter columns. If this method is set, a float ratio threshold
        needs to be provided. The floor(ratio * feature_num) features with the highest iv are picked. If multiple
        features around the threshold share the same iv, all of those columns are kept. Will be deprecated in the future.

    variance_coe_param: VarianceOfCoeSelectionParam
        Use the coefficient of variation to judge whether a column is filtered.
        Will be deprecated in the future.

    outlier_param: OutlierColsSelectionParam
        Filter columns whose certain percentile value is larger than a threshold.
        Will be deprecated in the future.

    percentage_value_param: PercentageValueParam
        Filter the columns in which a single value exceeds a certain percentage of rows.

    iv_param: IVFilterParam
        Sets how to filter based on iv. Only the take-high mode is supported. All of "threshold",
        "top_k" and "top_percentile" are accepted. Check more details in CommonFilterParam. To
        use this filter, the hetero-feature-binning module has to be provided.

    statistic_param: CommonFilterParam
        Sets how to filter based on statistic values. All of "threshold",
        "top_k" and "top_percentile" are accepted. Check more details in CommonFilterParam.
        To use this filter, the data_statistic module has to be provided.

    psi_param: CommonFilterParam
        Sets how to filter based on psi values. All of "threshold",
        "top_k" and "top_percentile" are accepted. Its take_high property should be False
        so that lower-psi features are chosen. Check more details in CommonFilterParam.
        To use this filter, the data_statistic module has to be provided.

    need_run: bool, default True
        Indicates whether this module needs to be run

    """

    def __init__(self, select_col_indexes=-1, select_names=None, filter_methods=None,
                 unique_param=UniqueValueParam(),
                 iv_value_param=IVValueSelectionParam(),
                 iv_percentile_param=IVPercentileSelectionParam(),
                 iv_top_k_param=IVTopKParam(),
                 variance_coe_param=VarianceOfCoeSelectionParam(),
                 outlier_param=OutlierColsSelectionParam(),
                 manually_param=ManuallyFilterParam(),
                 percentage_value_param=PercentageValueParam(),
                 iv_param=IVFilterParam(),
                 statistic_param=CommonFilterParam(metrics=consts.MEAN),
                 psi_param=CommonFilterParam(metrics=consts.PSI,
                                             take_high=False),
                 vif_param=CommonFilterParam(metrics=consts.VIF,
                                             threshold=5.0,
                                             take_high=False),
                 sbt_param=CommonFilterParam(metrics=consts.FEATURE_IMPORTANCE),
                 correlation_param=CorrelationFilterParam(),
                 need_run=True
                 ):
        super(FeatureSelectionParam, self).__init__()
        self.correlation_param = correlation_param
        self.vif_param = vif_param
        self.select_col_indexes = select_col_indexes
        if select_names is None:
            self.select_names = []
        else:
            self.select_names = select_names
        if filter_methods is None:
            self.filter_methods = [consts.MANUALLY_FILTER]
        else:
            self.filter_methods = filter_methods

        # deprecate in the future
        self.unique_param = copy.deepcopy(unique_param)
        self.iv_value_param = copy.deepcopy(iv_value_param)
        self.iv_percentile_param = copy.deepcopy(iv_percentile_param)
        self.iv_top_k_param = copy.deepcopy(iv_top_k_param)
        self.variance_coe_param = copy.deepcopy(variance_coe_param)
        self.outlier_param = copy.deepcopy(outlier_param)
        self.percentage_value_param = copy.deepcopy(percentage_value_param)

        self.manually_param = copy.deepcopy(manually_param)
        self.iv_param = copy.deepcopy(iv_param)
        self.statistic_param = copy.deepcopy(statistic_param)
        self.psi_param = copy.deepcopy(psi_param)
        self.sbt_param = copy.deepcopy(sbt_param)
        self.need_run = need_run

    def check(self):
        descr = "hetero feature selection param's"

        self.check_defined_type(self.filter_methods, descr, ['list'])

        for idx, method in enumerate(self.filter_methods):
            method = method.lower()
            self.check_valid_value(method, descr, [consts.UNIQUE_VALUE, consts.IV_VALUE_THRES, consts.IV_PERCENTILE,
                                                   consts.COEFFICIENT_OF_VARIATION_VALUE_THRES, consts.OUTLIER_COLS,
                                                   consts.MANUALLY_FILTER, consts.PERCENTAGE_VALUE,
                                                   consts.IV_FILTER, consts.STATISTIC_FILTER, consts.IV_TOP_K,
                                                   consts.PSI_FILTER, consts.HETERO_SBT_FILTER,
                                                   consts.HOMO_SBT_FILTER, consts.HETERO_FAST_SBT_FILTER,
                                                   consts.VIF_FILTER, consts.CORRELATION_FILTER])

            self.filter_methods[idx] = method

        self.check_defined_type(self.select_col_indexes, descr, ['list', 'int'])

        self.unique_param.check()
        self.iv_value_param.check()
        self.iv_percentile_param.check()
        self.iv_top_k_param.check()
        self.variance_coe_param.check()
        self.outlier_param.check()
        self.manually_param.check()
        self.percentage_value_param.check()

        self.iv_param.check()
        for th in self.iv_param.take_high:
            if not th:
                raise ValueError("Iv filter should take higher iv features")
        for m in self.iv_param.metrics:
            if m != consts.IV:
                raise ValueError("For iv filter, metrics should be 'iv'")

        self.statistic_param.check()
        self.psi_param.check()
        for th in self.psi_param.take_high:
            if th:
                raise ValueError("PSI filter should take lower psi features")
        for m in self.psi_param.metrics:
            if m != consts.PSI:
                raise ValueError("For psi filter, metrics should be 'psi'")

        self.sbt_param.check()
        for th in self.sbt_param.take_high:
            if not th:
                raise ValueError("SBT filter should take higher feature_importance features")
        for m in self.sbt_param.metrics:
            if m != consts.FEATURE_IMPORTANCE:
                raise ValueError("For SBT filter, metrics should be 'feature_importance'")

        self.vif_param.check()
        for m in self.vif_param.metrics:
            if m != consts.VIF:
                raise ValueError("For VIF filter, metrics should be 'vif'")

        self.correlation_param.check()

        self._warn_to_deprecate_param("iv_value_param", descr, "iv_param")
        self._warn_to_deprecate_param("iv_percentile_param", descr, "iv_param")
        self._warn_to_deprecate_param("iv_top_k_param", descr, "iv_param")
        self._warn_to_deprecate_param("variance_coe_param", descr, "statistic_param")
        self._warn_to_deprecate_param("unique_param", descr, "statistic_param")
        self._warn_to_deprecate_param("outlier_param", descr, "statistic_param")
__init__(self, select_col_indexes=-1, select_names=None, filter_methods=None, unique_param=UniqueValueParam(), iv_value_param=IVValueSelectionParam(), iv_percentile_param=IVPercentileSelectionParam(), iv_top_k_param=IVTopKParam(), variance_coe_param=VarianceOfCoeSelectionParam(), outlier_param=OutlierColsSelectionParam(), manually_param=ManuallyFilterParam(), percentage_value_param=PercentageValueParam(), iv_param=IVFilterParam(), statistic_param=CommonFilterParam(metrics=consts.MEAN), psi_param=CommonFilterParam(metrics=consts.PSI, take_high=False), vif_param=CommonFilterParam(metrics=consts.VIF, threshold=5.0, take_high=False), sbt_param=CommonFilterParam(metrics=consts.FEATURE_IMPORTANCE), correlation_param=CorrelationFilterParam(), need_run=True) special
Source code in federatedml/param/feature_selection_param.py
def __init__(self, select_col_indexes=-1, select_names=None, filter_methods=None,
             unique_param=UniqueValueParam(),
             iv_value_param=IVValueSelectionParam(),
             iv_percentile_param=IVPercentileSelectionParam(),
             iv_top_k_param=IVTopKParam(),
             variance_coe_param=VarianceOfCoeSelectionParam(),
             outlier_param=OutlierColsSelectionParam(),
             manually_param=ManuallyFilterParam(),
             percentage_value_param=PercentageValueParam(),
             iv_param=IVFilterParam(),
             statistic_param=CommonFilterParam(metrics=consts.MEAN),
             psi_param=CommonFilterParam(metrics=consts.PSI,
                                         take_high=False),
             vif_param=CommonFilterParam(metrics=consts.VIF,
                                         threshold=5.0,
                                         take_high=False),
             sbt_param=CommonFilterParam(metrics=consts.FEATURE_IMPORTANCE),
             correlation_param=CorrelationFilterParam(),
             need_run=True
             ):
    super(FeatureSelectionParam, self).__init__()
    self.correlation_param = correlation_param
    self.vif_param = vif_param
    self.select_col_indexes = select_col_indexes
    if select_names is None:
        self.select_names = []
    else:
        self.select_names = select_names
    if filter_methods is None:
        self.filter_methods = [consts.MANUALLY_FILTER]
    else:
        self.filter_methods = filter_methods

    # deprecate in the future
    self.unique_param = copy.deepcopy(unique_param)
    self.iv_value_param = copy.deepcopy(iv_value_param)
    self.iv_percentile_param = copy.deepcopy(iv_percentile_param)
    self.iv_top_k_param = copy.deepcopy(iv_top_k_param)
    self.variance_coe_param = copy.deepcopy(variance_coe_param)
    self.outlier_param = copy.deepcopy(outlier_param)
    self.percentage_value_param = copy.deepcopy(percentage_value_param)

    self.manually_param = copy.deepcopy(manually_param)
    self.iv_param = copy.deepcopy(iv_param)
    self.statistic_param = copy.deepcopy(statistic_param)
    self.psi_param = copy.deepcopy(psi_param)
    self.sbt_param = copy.deepcopy(sbt_param)
    self.need_run = need_run
check(self)
Source code in federatedml/param/feature_selection_param.py
def check(self):
    descr = "hetero feature selection param's"

    self.check_defined_type(self.filter_methods, descr, ['list'])

    for idx, method in enumerate(self.filter_methods):
        method = method.lower()
        self.check_valid_value(method, descr, [consts.UNIQUE_VALUE, consts.IV_VALUE_THRES, consts.IV_PERCENTILE,
                                               consts.COEFFICIENT_OF_VARIATION_VALUE_THRES, consts.OUTLIER_COLS,
                                               consts.MANUALLY_FILTER, consts.PERCENTAGE_VALUE,
                                               consts.IV_FILTER, consts.STATISTIC_FILTER, consts.IV_TOP_K,
                                               consts.PSI_FILTER, consts.HETERO_SBT_FILTER,
                                               consts.HOMO_SBT_FILTER, consts.HETERO_FAST_SBT_FILTER,
                                               consts.VIF_FILTER, consts.CORRELATION_FILTER])

        self.filter_methods[idx] = method

    self.check_defined_type(self.select_col_indexes, descr, ['list', 'int'])

    self.unique_param.check()
    self.iv_value_param.check()
    self.iv_percentile_param.check()
    self.iv_top_k_param.check()
    self.variance_coe_param.check()
    self.outlier_param.check()
    self.manually_param.check()
    self.percentage_value_param.check()

    self.iv_param.check()
    for th in self.iv_param.take_high:
        if not th:
            raise ValueError("Iv filter should take higher iv features")
    for m in self.iv_param.metrics:
        if m != consts.IV:
            raise ValueError("For iv filter, metrics should be 'iv'")

    self.statistic_param.check()
    self.psi_param.check()
    for th in self.psi_param.take_high:
        if th:
            raise ValueError("PSI filter should take lower psi features")
    for m in self.psi_param.metrics:
        if m != consts.PSI:
            raise ValueError("For psi filter, metrics should be 'psi'")

    self.sbt_param.check()
    for th in self.sbt_param.take_high:
        if not th:
            raise ValueError("SBT filter should take higher feature_importance features")
    for m in self.sbt_param.metrics:
        if m != consts.FEATURE_IMPORTANCE:
            raise ValueError("For SBT filter, metrics should be 'feature_importance'")

    self.vif_param.check()
    for m in self.vif_param.metrics:
        if m != consts.VIF:
            raise ValueError("For VIF filter, metrics should be 'vif'")

    self.correlation_param.check()

    self._warn_to_deprecate_param("iv_value_param", descr, "iv_param")
    self._warn_to_deprecate_param("iv_percentile_param", descr, "iv_param")
    self._warn_to_deprecate_param("iv_top_k_param", descr, "iv_param")
    self._warn_to_deprecate_param("variance_coe_param", descr, "statistic_param")
    self._warn_to_deprecate_param("unique_param", descr, "statistic_param")
    self._warn_to_deprecate_param("outlier_param", descr, "statistic_param")
feldman_verifiable_sum_param
Classes
FeldmanVerifiableSumParam (BaseParam)

Define which columns to sum and with what decimal precision.

Parameters:

Name Type Description Default
sum_cols list of column index, default: None

Specify which columns need to be summed. If None, every column will be summed.

None
q_n int, positive integer less than or equal to 16, default: 6

q_n is the number of significant decimal digits. If the data type is float, the maximum number of significant digits is 16; the sum of integer digits and significant decimal digits should be less than or equal to 16.

6
Source code in federatedml/param/feldman_verifiable_sum_param.py
class FeldmanVerifiableSumParam(BaseParam):
    """
    Define which columns to sum and with what decimal precision.

    Parameters
    ----------
    sum_cols : list of column index, default: None
        Specify which columns need to be summed. If None, every column will be summed.

    q_n : int, positive integer less than or equal to 16, default: 6
        q_n is the number of significant decimal digits. If the data type is float,
        the maximum number of significant digits is 16; the sum of integer digits and
        significant decimal digits should be less than or equal to 16.
    """
    def __init__(self, sum_cols=None, q_n=6):
        self.sum_cols = sum_cols
        if sum_cols is None:
            self.sum_cols = []

        self.q_n = q_n

    def check(self):
        if isinstance(self.sum_cols, list):
            for idx in self.sum_cols:
                if not isinstance(idx, int):
                    raise ValueError(f"type mismatch, column_indexes with element {idx}(type is {type(idx)})")

        if not isinstance(self.q_n, int):
            raise ValueError(f"Init param's q_n {self.q_n} not supported, should be int type", type is {type(self.q_n)})

        if self.q_n < 0:
            raise ValueError(f"param's q_n {self.q_n} not supported, should be non-negative int value")
        elif self.q_n > 16:
            raise ValueError(f"param's q_n {self.q_n} not supported, should be less than or equal to 16")
__init__(self, sum_cols=None, q_n=6) special
Source code in federatedml/param/feldman_verifiable_sum_param.py
def __init__(self, sum_cols=None, q_n=6):
    self.sum_cols = sum_cols
    if sum_cols is None:
        self.sum_cols = []

    self.q_n = q_n
check(self)
Source code in federatedml/param/feldman_verifiable_sum_param.py
def check(self):
    if isinstance(self.sum_cols, list):
        for idx in self.sum_cols:
            if not isinstance(idx, int):
                raise ValueError(f"type mismatch, column_indexes with element {idx}(type is {type(idx)})")

    if not isinstance(self.q_n, int):
        raise ValueError(f"Init param's q_n {self.q_n} not supported, should be int type", type is {type(self.q_n)})

    if self.q_n < 0:
        raise ValueError(f"param's q_n {self.q_n} not supported, should be non-negative int value")
    elif self.q_n > 16:
        raise ValueError(f"param's q_n {self.q_n} not supported, should be less than or equal to 16")
ftl_param
deprecated_param_list
Classes
FTLParam (BaseParam)
Source code in federatedml/param/ftl_param.py
class FTLParam(BaseParam):

    def __init__(self, alpha=1, tol=0.000001,
                 n_iter_no_change=False, validation_freqs=None, optimizer={'optimizer': 'Adam', 'learning_rate': 0.01},
                 nn_define={}, epochs=1
                 , intersect_param=IntersectParam(consts.RSA), config_type='keras', batch_size=-1,
                 encrypte_param=EncryptParam(),
                 encrypted_mode_calculator_param=EncryptedModeCalculatorParam(mode="confusion_opt"),
                 predict_param=PredictParam(), mode='plain', communication_efficient=False,
                 local_round=5, callback_param=CallbackParam()):
        """
        Parameters
        ----------
        alpha : float
            a loss coefficient defined in the paper; it controls the importance of the alignment loss
        tol : float
            loss tolerance
        n_iter_no_change : bool
            check loss convergence or not
        validation_freqs : None or positive integer or container object in python
            Whether to do validation during the training process.
            If None, no validation is done during training;
            if a positive integer, data is validated every validation_freqs epochs;
            if a container object, data is validated whenever the epoch number is in the container,
            e.g. validation_freqs = [10, 15] validates at epochs 10 and 15.
            The default value is None; 1 is suggested. You can set it to a number larger than 1 in order to
            speed up training by skipping validation rounds. When it is larger than 1, a number that
            divides "epochs" evenly is recommended; otherwise, you will miss the validation scores
            of the last training epoch.
        optimizer : str or dict
            optimizer method, accepts the following types:
            1. a string, one of "Adadelta", "Adagrad", "Adam", "Adamax", "Nadam", "RMSprop", "SGD"
            2. a dict, with a required key-value pair keyed by "optimizer",
                with optional key-value pairs such as learning rate.
            defaults to "SGD"
        nn_define : dict
            a dict represents the structure of neural network, it can be output by tf-keras
        epochs : int
            epochs num
        intersect_param
            define the intersect method
        config_type : {'keras'}
            config type
        batch_size : int
            batch size when computing transformed feature embedding, -1 use full data.
        encrypte_param
            encrypted param
        encrypted_mode_calculator_param
            encrypted mode calculator param
        predict_param
            predict param
        mode: {"plain", "encrypted"}
            plain: no encryption algorithm is used; data is exchanged in plaintext
            encrypted: Paillier is used to encrypt gradients
        communication_efficient: bool
            whether to use the communication-efficient mode. When enabled, the FTL model updates
            gradients over several local rounds using intermediate data
        local_round: int
            local update round when using communication efficient
        """

        super(FTLParam, self).__init__()
        self.alpha = alpha
        self.tol = tol
        self.n_iter_no_change = n_iter_no_change
        self.validation_freqs = validation_freqs
        self.optimizer = optimizer
        self.nn_define = nn_define
        self.epochs = epochs
        self.intersect_param = copy.deepcopy(intersect_param)
        self.config_type = config_type
        self.batch_size = batch_size
        self.encrypted_mode_calculator_param = copy.deepcopy(encrypted_mode_calculator_param)
        self.encrypt_param = copy.deepcopy(encrypte_param)
        self.predict_param = copy.deepcopy(predict_param)
        self.mode = mode
        self.communication_efficient = communication_efficient
        self.local_round = local_round
        self.callback_param = copy.deepcopy(callback_param)

    def check(self):
        self.intersect_param.check()
        self.encrypt_param.check()
        self.encrypted_mode_calculator_param.check()

        self.optimizer = self._parse_optimizer(self.optimizer)

        supported_config_type = ["keras"]
        if self.config_type not in supported_config_type:
            raise ValueError(f"config_type should be one of {supported_config_type}")

        if not isinstance(self.tol, (int, float)):
            raise ValueError("tol should be numeric")

        if not isinstance(self.epochs, int) or self.epochs <= 0:
            raise ValueError("epochs should be a positive integer")

        if self.nn_define and not isinstance(self.nn_define, dict):
            raise ValueError("bottom_nn_define should be a dict defining the structure of neural network")

        if self.batch_size != -1:
            if not isinstance(self.batch_size, int) \
                    or self.batch_size < consts.MIN_BATCH_SIZE:
                raise ValueError(
                    "batch_size {} not supported, should be -1 (use all data) or no smaller"
                    " than {}".format(self.batch_size, consts.MIN_BATCH_SIZE))

        for p in deprecated_param_list:
            # if self._warn_to_deprecate_param(p, "", ""):
            if self._deprecated_params_set.get(p):
                if "callback_param" in self.get_user_feeded():
                    raise ValueError(f"{p} and callback param should not be set simultaneously,"
                                     f"{self._deprecated_params_set}, {self.get_user_feeded()}")
                else:
                    self.callback_param.callbacks = ["PerformanceEvaluate"]
                break

        descr = "ftl's"

        if self._warn_to_deprecate_param("validation_freqs", descr, "callback_param's 'validation_freqs'"):
            self.callback_param.validation_freqs = self.validation_freqs

        if self._warn_to_deprecate_param("metrics", descr, "callback_param's 'metrics'"):
            self.callback_param.metrics = self.metrics

        if self.validation_freqs is None:
            pass
        elif isinstance(self.validation_freqs, int):
            if self.validation_freqs < 1:
                raise ValueError("validation_freqs should be larger than 0 when it's integer")
        elif not isinstance(self.validation_freqs, collections.abc.Container):
            raise ValueError("validation_freqs should be None or positive integer or container")

        assert type(self.communication_efficient) is bool, 'communication efficient must be a boolean'
        assert self.mode in ['encrypted', 'plain'], 'mode options: encrypted or plain, but {} is offered'.format(self.mode)

        self.check_positive_integer(self.epochs, 'epochs')
        self.check_positive_number(self.alpha, 'alpha')
        self.check_positive_integer(self.local_round, 'local round')

    @staticmethod
    def _parse_optimizer(opt):
        """
        Examples:

            1. "optimize": "SGD"
            2. "optimize": {
                "optimizer": "SGD",
                "learning_rate": 0.05
            }
        """

        kwargs = {}
        if isinstance(opt, str):
            return SimpleNamespace(optimizer=opt, kwargs=kwargs)
        elif isinstance(opt, dict):
            optimizer = opt.get("optimizer", kwargs)
            if not optimizer:
                raise ValueError(f"optimizer config: {opt} invalid")
            kwargs = {k: v for k, v in opt.items() if k != "optimizer"}
            return SimpleNamespace(optimizer=optimizer, kwargs=kwargs)
        else:
            raise ValueError(f"invalid type for optimize: {type(opt)}")
Methods
__init__(self, alpha=1, t