Bug with zero TS #1148

Closed
aPovidlo opened this issue Aug 18, 2023 · 1 comment · Fixed by #1153
Labels
bug Something isn't working

Comments

@aPovidlo
Collaborator

FEDOT crashes after being fitted on a time series that consists entirely of zero values.

Code to reproduce:

import numpy as np

from fedot.api.main import Fedot
from fedot.core.data.data import InputData
from fedot.core.repository.dataset_types import DataTypesEnum
from fedot.core.repository.tasks import Task, TaskTypesEnum, TsForecastingParams

horizon = 10
task_params = TsForecastingParams(forecast_length=horizon)
task = Task(TaskTypesEnum.ts_forecasting, task_params)

input_data = InputData(
    idx=np.arange(0, 50),
    features=np.zeros(50),
    target=np.zeros(50),
    task=task,
    data_type=DataTypesEnum.ts
)

auto_model = Fedot(
    problem='ts_forecasting',
    task_params=task_params,
    timeout=4,
    n_jobs=-1,
    seed=15
)

pipeline = auto_model.fit(input_data)

Traceback:

Generations:   0%|          | 1/10000 [02:24<?, ?gen/s]
Traceback (most recent call last):
  File "C:\Users\andre\AppData\Local\Programs\Python\Python39\lib\contextlib.py", line 135, in __exit__
    self.gen.throw(type, value, traceback)
  File "C:\Users\andre\Documents\GitHub\FEDOT\fedot\api\time.py", line 58, in launch_composing
    yield
  File "C:\Users\andre\Documents\GitHub\FEDOT\fedot\api\api_utils\api_composer.py", line 130, in compose_pipeline
    best_pipelines = gp_composer.compose_pipeline(data=train_data)
  File "C:\Users\andre\Documents\GitHub\FEDOT\fedot\core\composer\gp_composer\gp_composer.py", line 67, in compose_pipeline
    opt_result = self.optimizer.optimise(objective_function)
  File "C:\Users\andre\Documents\GitHub\FEDOT\venv\lib\site-packages\golem\core\optimisers\populational_optimizer.py", line 98, in optimise
    new_population = self._evolve_population(evaluator)
  File "C:\Users\andre\Documents\GitHub\FEDOT\venv\lib\site-packages\golem\core\optimisers\genetic\gp_optimizer.py", line 112, in _evolve_population
    new_population = self.reproducer.reproduce(individuals_to_select, evaluator)
  File "C:\Users\andre\Documents\GitHub\FEDOT\venv\lib\site-packages\golem\core\optimisers\genetic\operators\reproduction.py", line 106, in reproduce
    collected_next_population.update({ind.uid: ind for ind in partial_next_population})
TypeError: 'NoneType' object is not iterable

Sometimes it causes an Exception instead:

Exception                                 Traceback (most recent call last)
Cell In[30], line 8
      4 locs.remove(base_location)
      6 print(f'Base TS was choosen randomly: {base_location}')
----> 8 model, pipeline, forecast = fit_predict_model(base_location, plots=True)
     10 test['emission'][test['location'] == base_location] = forecast
     12 for loc in tqdm(locs):

Cell In[28], line 12, in fit_predict_model(loc_name, plots)
      2 ts_train, ts_test, train_data, _, task_parameters = prepare_data(loc_name)
      4 auto_model = Fedot(
      5     problem='ts_forecasting',
      6     task_params=task_parameters,
   (...)
      9     seed=15,
     10 )
---> 12 pipeline = auto_model.fit(features=train_data)
     14 pipeline.show(node_size_scale=0.5, dpi=100)
     16 if plots:

File /opt/conda/lib/python3.10/site-packages/fedot/api/main.py:247, in Fedot.fit(self, features, target, predefined_model)
    243     self.current_pipeline = PredefinedModel(predefined_model, self.train_data, self.log,
    244                                             use_input_preprocessing=self.params.get(
    245                                                 'use_input_preprocessing')).fit()
    246 else:
--> 247     self.current_pipeline, self.best_models, self.history = self.api_composer.obtain_model(self.train_data)
    249     if self.current_pipeline is None:
    250         raise ValueError('No models were found')

File /opt/conda/lib/python3.10/site-packages/fedot/api/api_utils/api_composer.py:69, in ApiComposer.obtain_model(self, train_data)
     62 self.params.init_params_for_composing(self.timer.timedelta_composing, multi_objective)
     64 self.log.message(f"AutoML configured."
     65                  f" Parameters tuning: {with_tuning}."
     66                  f" Time limit: {timeout} min."
     67                  f" Set of candidate models: {self.params.get('available_operations')}.")
---> 69 best_pipeline, best_pipeline_candidates, gp_composer = self.compose_pipeline(
     70     train_data,
     71     initial_assumption,
     72     fitted_assumption
     73 )
     74 if with_tuning:
     75     best_pipeline = self.tune_final_pipeline(train_data, best_pipeline)

File /opt/conda/lib/python3.10/site-packages/fedot/api/api_utils/api_composer.py:130, in ApiComposer.compose_pipeline(self, train_data, initial_assumption, fitted_assumption)
    128 self.log.message('Pipeline composition started.')
    129 self.was_optimised = False
--> 130 best_pipelines = gp_composer.compose_pipeline(data=train_data)
    131 best_pipeline_candidates = gp_composer.best_models
    132 self.was_optimised = True

File /opt/conda/lib/python3.10/site-packages/fedot/core/composer/gp_composer/gp_composer.py:67, in GPComposer.compose_pipeline(self, data)
     64     self.optimizer.set_evaluation_callback(objective_evaluator.evaluate_intermediate_metrics)
     66 # Finally, run optimization process
---> 67 opt_result = self.optimizer.optimise(objective_function)
     69 best_model, self.best_models = self._convert_opt_results_to_pipeline(opt_result)
     70 self.log.info('GP composition finished')

File /opt/conda/lib/python3.10/site-packages/golem/core/optimisers/populational_optimizer.py:98, in PopulationalOptimizer.optimise(self, objective)
     96 while not self.stop_optimization():
     97     try:
---> 98         new_population = self._evolve_population(evaluator)
     99         if self.gen_structural_diversity_check != -1 \
    100                 and self.generations.generation_num % self.gen_structural_diversity_check == 0 \
    101                 and self.generations.generation_num != 0:
    102             new_population = self.get_structure_unique_population(new_population, evaluator)

File /opt/conda/lib/python3.10/site-packages/golem/core/optimisers/genetic/gp_optimizer.py:112, in EvoGraphOptimizer._evolve_population(self, evaluator)
    110 individuals_to_select = self.regularization(self.population, evaluator)
    111 # Reproduce from previous pop to get next population
--> 112 new_population = self.reproducer.reproduce(individuals_to_select, evaluator)
    114 # Adaptive agent experience collection & learning
    115 # Must be called after reproduction (that collects the new experience)
    116 experience = self.mutation.agent_experience

File /opt/conda/lib/python3.10/site-packages/golem/core/optimisers/genetic/operators/reproduction.py:104, in ReproductionController.reproduce(self, population, evaluator)
    101 residual_size = min(len(population), residual_size)
    103 # Reproduce the required number of individuals that equals residual size
--> 104 partial_next_population = self.reproduce_uncontrolled(population, evaluator, residual_size)
    105 # Avoid duplicate individuals that can come unchanged from previous population
    106 collected_next_population.update({ind.uid: ind for ind in partial_next_population})

File /opt/conda/lib/python3.10/site-packages/golem/core/optimisers/genetic/operators/reproduction.py:82, in ReproductionController.reproduce_uncontrolled(self, population, evaluator, pop_size)
     80 new_population = self.crossover(selected_individuals)
     81 new_population = self.mutation(new_population)
---> 82 new_population = evaluator(new_population)
     83 return new_population

File /opt/conda/lib/python3.10/site-packages/golem/core/optimisers/genetic/evaluation.py:199, in BaseGraphEvaluationDispatcher.evaluate_with_cache(self, population)
    197 reversed_population = list(reversed(population))
    198 self._remote_compute_cache(reversed_population)
--> 199 evaluated_population = self.evaluate_population(reversed_population)
    200 self._reset_eval_cache()
    201 return evaluated_population

File /opt/conda/lib/python3.10/site-packages/golem/core/optimisers/genetic/evaluation.py:249, in MultiprocessingDispatcher.evaluate_population(self, individuals)
    247 parallel = Parallel(n_jobs=n_jobs, verbose=0, pre_dispatch="2*n_jobs")
    248 eval_func = partial(self.evaluate_single, logs_initializer=Log().get_parameters())
--> 249 evaluation_results = parallel(delayed(eval_func)(ind.graph, ind.uid) for ind in individuals_to_evaluate)
    250 individuals_evaluated = self.apply_evaluation_results(individuals_to_evaluate, evaluation_results)
    251 # If there were no successful evals then try once again getting at least one,
    252 # even if time limit was reached

File /opt/conda/lib/python3.10/site-packages/joblib/parallel.py:1098, in Parallel.__call__(self, iterable)
   1095     self._iterating = False
   1097 with self._backend.retrieval_context():
-> 1098     self.retrieve()
   1099 # Make sure that we get a last message telling us we are done
   1100 elapsed_time = time.time() - self._start_time

File /opt/conda/lib/python3.10/site-packages/joblib/parallel.py:975, in Parallel.retrieve(self)
    973 try:
    974     if getattr(self._backend, 'supports_timeout', False):
--> 975         self._output.extend(job.get(timeout=self.timeout))
    976     else:
    977         self._output.extend(job.get())

File /opt/conda/lib/python3.10/site-packages/joblib/_parallel_backends.py:567, in LokyBackend.wrap_future_result(future, timeout)
    564 """Wrapper for Future.result to implement the same behaviour as
    565 AsyncResults.get from multiprocessing."""
    566 try:
--> 567     return future.result(timeout=timeout)
    568 except CfTimeoutError as e:
    569     raise TimeoutError from e

File /opt/conda/lib/python3.10/concurrent/futures/_base.py:458, in Future.result(self, timeout)
    456     raise CancelledError()
    457 elif self._state == FINISHED:
--> 458     return self.__get_result()
    459 else:
    460     raise TimeoutError()

File /opt/conda/lib/python3.10/concurrent/futures/_base.py:403, in Future.__get_result(self)
    401 if self._exception:
    402     try:
--> 403         raise self._exception
    404     finally:
    405         # Break a reference cycle with the exception in self._exception
    406         self = None

Exception: Invalid fitness after objective evaluation. Skipping the graph: (/n_lagged_{'window_size': 10};)/n_cgru_{'hidden_size': 200, 'learning_rate': 0.001, 'cnn1_kernel_size': 5, 'cnn1_output_size': 32, 'cnn2_kernel_size': 5, 'cnn2_output_size': 64, 'batch_size': 64, 'num_epochs': 50, 'optimizer': 'adamw', 'loss': 'mse'}
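
A plausible root cause, consistent with the fix referenced below that adjusts a denominator in CGRU (this is an assumption, not a statement from the thread): a constant series has zero range and zero variance, so any normalization step that divides by that quantity turns the whole window into NaNs, and the candidate pipeline's fitness ends up invalid. A minimal standalone sketch of the effect, not FEDOT code:

import numpy as np

series = np.zeros(50)                    # the reproduced input: an all-zero time series
span = series.max() - series.min()       # zero for any constant series
scaled = (series - series.min()) / span  # 0 / 0 -> NaN (numpy emits an invalid-value warning)
print(np.isnan(scaled).all())            # True: the whole scaled series is NaN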
@aPovidlo aPovidlo added the bug Something isn't working label Aug 18, 2023
@kasyanovse kasyanovse linked a pull request Aug 22, 2023 that will close this issue
@kasyanovse kasyanovse removed a link to a pull request Aug 22, 2023
@kasyanovse kasyanovse linked a pull request Aug 22, 2023 that will close this issue
@nicl-nno
Collaborator

#1095 may be related to the same problem.

@kasyanovse kasyanovse mentioned this issue Aug 23, 2023
kasyanovse added a commit that referenced this issue Sep 3, 2023
1. Fix #1148 by fixing the denominator in CGRU and add a test for the new code
2. Fix #1151 by setting n_jobs=1 for some operations
3. Add an initial assumption with AR (#1074) and enable AR (#1137)
4. Check and add a test in accordance with #739
5. Fix the integration test `test_result_changing`
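
For illustration, a zero-safe denominator guard of the kind described in the first item might look like the sketch below; this is a hypothetical example, not the actual change from #1153:

import numpy as np

def safe_scale(series: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Min-max scale a series, falling back to zeros when its range is (near) zero."""
    span = series.max() - series.min()
    if span < eps:  # constant (e.g. all-zero) series: avoid dividing by zero
        return np.zeros_like(series, dtype=float)
    return (series - series.min()) / span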