This repository has been archived by the owner on Jul 27, 2023. It is now read-only.

Fix distributive upgrade from Mantl 1.0.3 -> 1.1 #1296

Merged

merged 4 commits into master on Apr 7, 2016

Conversation

langston-barrett
Contributor

Waiting on mantl/mantl-packaging#67

  • Installs cleanly on a fresh build of most recent master branch
  • Upgrades cleanly from the most recent release
  • Updates documentation relevant to the changes

Tested on AWS

@langston-barrett langston-barrett force-pushed the fix/distributive-upgrade branch 2 times, most recently from 7e471a7 to 068fad7 Compare March 27, 2016 22:21
```yaml
with_items:
  - asteris-mantl-rpm.repo
  - ciscocloud-rpm.repo

# BASICS - we need every node in the cluster to have common software running to
```
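For context, here is a minimal sketch of the kind of task the `with_items` fragment above would belong to in sample.yml. The `copy` module usage, the task name, and the src/dest paths are assumptions for illustration, not the exact task from this PR:

```yaml
# Hypothetical sketch only: task name and src/dest paths are assumptions.
- name: install mantl package repositories
  copy:
    src: "{{ item }}"
    dest: "/etc/yum.repos.d/{{ item }}"
  with_items:
    - asteris-mantl-rpm.repo
    - ciscocloud-rpm.repo
```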
Contributor

I'm just wondering if this is the best place for this. A fresh install failed for me b/c I hadn't updated my local mantl.yml with this task from sample.yml. I'm not sure how widespread an issue this will be, since it won't affect new users, but it might bite more of us who have existing Mantl configs. And, in general, keeping the sample.yml as simple as possible seems like a good thing. Do you think it might be cleaner to just put this in a tiny role that is included as a dependency in common?

Contributor Author

I'm glad to get feedback on this, since I wasn't really sure where to put it. I don't have any kind of a preference. I think that we should be able to expect that folks will update their mantl.yml with new stuff in sample.yml every release (thinking about past releases when we have added/removed roles, etc.), but I'm also just generally unsure of where this belongs.

I like the idea of using Ansible role dependencies more throughout our project so we don't have to depend on, say, the ordering of tasks in sample.yml. If we create a new role, maybe we can add it as a dependency for any role that uses our repos.

Contributor

I don't think this belongs in the playbook. We should include roles and depend on them, sure, but this pins us to supporting this for longer than in a role. (Because it's encouraged to update your sample.yml with customizations.)

Maybe this should be a "repos" role that common et al. depend on.
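The wiring suggested here could look something like the sketch below. The role name "repos" comes from the comment above; the file layout and everything else in it is an assumption about how it might be structured:

```yaml
# Hypothetical sketch: roles/common/meta/main.yml (layout is assumed).
# Declaring the proposed "repos" role as a dependency of common, so any
# play that applies common gets the repo files installed first.
dependencies:
  - role: repos
```

With Ansible role dependencies, the repo setup would run automatically before common's own tasks, removing the need to rely on task ordering in sample.yml.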

Contributor

+1, I think this will be a little cleaner. @siddharthist do you mind moving this to a "repos" role?

Contributor Author

Not at all, will do ASAP.

@ryane
Contributor

ryane commented Apr 1, 2016

Successful upgrade from 1.0.3! Just had one question about the sample.yml.

@langston-barrett
Contributor Author

@ryane @BrianHicks Done, tested upgrade from 1.0.3 -> 1.1 on AWS. Travis will check fresh install.

@langston-barrett
Contributor Author

I'm a little confused, Travis is failing on a task (common | enable yum repos) that isn't present in this branch. I'm going to rebase and try again.

```
$ git status
On branch fix/distributive-upgrade
Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

	modified:   plugins/inventory/terraform.py

Untracked files:
  (use "git add <file>..." to include in what will be committed)

	aws.tf

no changes added to commit (use "git add" and/or "git commit -a")
$ grep -ri 'enable yum repos' roles/common
$
```

@langston-barrett langston-barrett force-pushed the fix/distributive-upgrade branch from 9fe3d57 to c15a32e Compare April 5, 2016 20:31
@langston-barrett
Contributor Author

Well, after rebasing I found the problem, but Travis isn't running the tests. I'll rebase again, I suppose.

@langston-barrett langston-barrett force-pushed the fix/distributive-upgrade branch from 3ebf803 to 3e5a8b3 Compare April 6, 2016 05:28
@ryane
Contributor

ryane commented Apr 6, 2016

is this ready for a manual test? it looks like the aws build failed due to a transient network error but the other two builds completed successfully.

@langston-barrett
Contributor Author

@ryane Yes, it's ready for manual testing. Just an upgrade from 1.0.3 -> master on another cloud would be perfect.

@ryane
Contributor

ryane commented Apr 6, 2016

cool, I'll kick off an upgrade on gce

@langston-barrett langston-barrett mentioned this pull request Apr 6, 2016
3 tasks
@ryane
Contributor

ryane commented Apr 7, 2016

successful upgrade from 1.0.3 on GCE

@ryane ryane merged commit 6903175 into master Apr 7, 2016
@ryane ryane deleted the fix/distributive-upgrade branch April 7, 2016 01:14
@ryane ryane modified the milestone: 1.1 May 10, 2016