Apache Hadoop Jobs

    2 jobs found, pricing in USD

    Below are three metrics that represent a sample of job performance KPIs in the cloud, where each job is executed on a number of nodes. The goal is to extract all time series features (feature engineering) of all the nodes for each job, using only PySpark (DataFrame API), from the three metrics into one dataframe, and to define each feature used in your code (more details is appreciat...
    (A hedged PySpark sketch of this task follows the bid details below.)

    $15 (Avg Bid)
    $15
    1 entry
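    The feature-engineering task described in the listing above can be sketched with the PySpark DataFrame API. This is a minimal sketch, not the poster's actual pipeline: the input path, the column names (job_id, node_id, cpu_usage, mem_usage, io_wait), and the choice of summary statistics are all assumptions for illustration.

```python
# Minimal sketch: per-job time-series feature extraction with the PySpark DataFrame API.
# All column names and the input path are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kpi-feature-engineering").getOrCreate()

# Assumed layout: one row per (job, node, timestamp) carrying the three KPI metrics.
kpis = spark.read.parquet("kpis.parquet")  # hypothetical input path

metrics = ["cpu_usage", "mem_usage", "io_wait"]  # hypothetical metric columns

# Basic statistical features per job and per metric, summarizing the
# time series contributed by all nodes that ran the job.
aggs = []
for m in metrics:
    aggs += [
        F.mean(m).alias(f"{m}_mean"),
        F.stddev(m).alias(f"{m}_std"),
        F.min(m).alias(f"{m}_min"),
        F.max(m).alias(f"{m}_max"),
    ]

features = kpis.groupBy("job_id").agg(*aggs)
features.show(5)
```

    Grouping by job_id collapses each job's per-node time series into a single feature row, which matches the requirement of producing one dataframe of features across all metrics.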
    time series anomaly detection 2 days left
    VERIFIED

    The goal of my work is to identify anomalous jobs running in the cloud using job performance KPIs. I have a huge number of jobs in the cloud, each job executed on a number of nodes, and each node is monitored by a large number of KPIs, so each job KPI (e.g. CPU usage) represents a time series of the CPU usage of all nodes at the time the job was executed, using PySpark...
    (A hedged anomaly-flagging sketch follows the bid details below.)

    $164 (Avg Bid)
    $164
    15 bids
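    For the anomaly-detection listing, a minimal sketch, assuming the hypothetical features dataframe from the previous sketch: flag jobs whose mean CPU usage lies more than three standard deviations from the population mean. The single feature and the 3-sigma threshold are illustrative choices, not the poster's method.

```python
# Minimal sketch: flag anomalous jobs by z-scoring one job-level KPI feature.
# Builds on the hypothetical `features` dataframe from the previous sketch.
from pyspark.sql import functions as F

# Population statistics of one engineered feature (hypothetical column name).
stats = features.select(
    F.mean("cpu_usage_mean").alias("mu"),
    F.stddev("cpu_usage_mean").alias("sigma"),
).first()

# Flag a job when its mean CPU usage lies more than 3 standard deviations
# from the population mean; the threshold is an illustrative choice.
flagged = features.withColumn(
    "is_anomaly",
    F.abs(F.col("cpu_usage_mean") - F.lit(stats["mu"])) > F.lit(3 * stats["sigma"]),
)
flagged.filter("is_anomaly").show()
```

    A simple z-score rule keeps everything inside the DataFrame API; a clustering- or model-based detector (e.g. pyspark.ml KMeans over the feature vectors) would be a natural next step if the poster needs more than a univariate threshold.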
