Integration level: thinnest/lightest, vs. tests & more #8
This is a big question mark to me... If we'd like to push harder to get the integration into the official repo, I think it would be very difficult to get maintainers to accept a number of new files inside their repository (and as you know @dandv, this already happened)... even more difficult would be getting them to accept changes to the initialization closure (as [described](http://www.meteorpedia.com/read/Packaging_existing_Libraries#Dealing%20with%20exports) on MeteorPedia), even if this already happened too.

So this is my feeling: if we say all we need to keep packages up to date is a good triggering mechanism (to get notified about both new releases and publishing errors...), I'm quite sure the right direction is leveraging autopublish.meteor.com and webhooks! Once we've got a webhook from the official repository for a particular library, we can start playing on MeteorPackaging without any constraint.

What I've been thinking about in these last hours for a wrapping process is something like this:
Then we can configure autopublish.meteor.com to trigger an
No forks, no submodules, complete control over the repo without any kind of concern! The only constraint would be having the file paths inside
api.addFiles([
  'upstream/dist/css/bootstrap.css',
  'upstream/dist/js/bootstrap.js',
], where)
expecting that after the second … What do you think about this proposal?
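A minimal sketch of what the wrapper `package.js` described in this proposal might look like, assuming the upstream library is checked out (or copied) into an `upstream/` directory; the package name, version, and `'client'` architecture below are placeholders, not part of the original proposal:

```js
// Hypothetical wrapper package.js for the proposed layout.
// Name, version and summary are placeholders; the version would be
// kept in sync with the upstream release by the publishing machinery.
Package.describe({
  name: 'twbs:bootstrap',
  version: '3.3.0',
  summary: 'Bootstrap (official), repackaged for Meteor.'
});

Package.onUse(function (api) {
  api.versionsFrom('1.0');
  // The only hard constraint of the proposal: file paths must point
  // into the upstream/ checkout.
  api.addFiles([
    'upstream/dist/css/bootstrap.css',
    'upstream/dist/js/bootstrap.js',
  ], 'client');
});
```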
Interesting. CC @paralin. The file paths passed to addFiles aren't much of a constraint - does it affect development and testing of the integration before the build? I don't think so.
Do we need a bash script? Can't we just pick up the version from package.json with package.js code?
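A rough sketch of the "pick up the version with package.js code" idea, assuming `Npm.require` is available inside `package.js` and that the upstream checkout ships a standard `package.json`; the package name and paths are placeholders:

```js
// Sketch: read the upstream version at build time instead of using a bash script.
// Assumes Npm.require works in package.js and that the path resolves relative
// to the package directory.
var fs = Npm.require('fs');
var path = Npm.require('path');

var upstreamManifest = JSON.parse(
  fs.readFileSync(path.join('upstream', 'package.json'), 'utf8')
);

Package.describe({
  name: 'twbs:bootstrap',            // placeholder name
  version: upstreamManifest.version, // e.g. "3.3.0", taken straight from upstream
  summary: 'Wrapper that tracks the upstream release automatically.'
});
```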
Just quick input - sorry if some of this seems off topic :)

I think we could get more done if we had a general node package that did one thing: convert bower.json/package.json etc. into a package.js file - we might have to rig a simple analyser to detect globals added via window etc. In most cases we can get the exports and where the code runs (server/client). The GitHub hook could listen for semver tags, then convert the external package into a Meteor package using the generated package.js, and then have the server log in as bower etc. and publish the new package.

Question is - what stops us from monitoring the bower db for versions or checking tags on repos? // Packages should be added manually, as they are requested - making sure only quality packages are added

Some of these package configs support testing - but we might just want to do an environment test - e.g. testing the exports. We could perhaps add some additional tests when adding the package, e.g. have a field where one could write tinytests.
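A very rough sketch of the kind of converter described above: a small node script that turns a `bower.json`/`package.json` manifest into a generated `package.js`. All names, the output template, and the choice of `'client'` as the default architecture are illustrative assumptions:

```js
// generate-package-js.js — illustrative manifest-to-package.js converter.
var fs = require('fs');

function generatePackageJs(manifestPath, meteorName) {
  var manifest = JSON.parse(fs.readFileSync(manifestPath, 'utf8'));
  // bower.json's "main" lists the entry files; fall back to an empty list.
  var files = [].concat(manifest.main || []);

  return [
    "Package.describe({",
    "  name: " + JSON.stringify(meteorName) + ",",
    "  version: " + JSON.stringify(manifest.version || '0.0.0') + ",",
    "  summary: " + JSON.stringify(manifest.description || '') + ",",
    "});",
    "",
    "Package.onUse(function (api) {",
    "  api.versionsFrom('1.0');",
    "  // A real converter would also detect exported globals and server/client usage.",
    "  api.addFiles(" + JSON.stringify(files) + ", 'client');",
    "});",
    ""
  ].join('\n');
}

// Example usage (hypothetical paths and package name):
// fs.writeFileSync('package.js', generatePackageJs('upstream/bower.json', 'raix:example'));
```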
We could, @dandv, but with that solution the package won't work as a local package! When you run … If, instead, you clone the package repo under your project's …
@raix's idea of monitoring bower's database is really interesting. Can we skip the request to the 3rd-party library maintainer to create the webhook or authorize autopublish, @splendido?
Alternatively, we could monitor the npm registry. I need to look into that, but here's how to get the latest version of a package: http://registry.npmjs.org/summernote/latest
BTW, the above can be used as a fallback for when reading the version from …
Note: there is an effort underway to fold bower into npm.
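For reference, the registry endpoint mentioned above can be queried with a few lines of node; this is only a sketch, with minimal error handling:

```js
// Sketch: ask the npm registry for the latest published version of a package,
// using the /<name>/latest endpoint mentioned above.
var https = require('https');

function latestNpmVersion(pkgName, callback) {
  https.get('https://registry.npmjs.org/' + pkgName + '/latest', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      try {
        callback(null, JSON.parse(body).version);
      } catch (err) {
        callback(err);
      }
    });
  }).on('error', callback);
}

// latestNpmVersion('summernote', function (err, version) { console.log(version); });
```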
An interesting radical approach. Playing devil's advocate against it:
If we eliminate the need for package.js (auto-generate it as @raix suggested, or update it from the Bower/npm registry) and the README patch is the only thing we submit in the PR, the chances of acceptance increase even more.
So why don't we periodically ask for the latest release using the GitHub API? It would be a periodic poll in any case (every 24h?)... This might work well for 3rd-party libraries, but if we want to open the service to regular Meteor developers, the hook might be a better solution.
Even if, in principle, keeping a wrapper repo wouldn't be needed with the approach proposed by @raix, I expect it will be useful for keeping past history, plus dedicated tests, a custom README.md, etc... So a mixture of the two might be the way to go?
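A hedged sketch of what the periodic poll could look like, using the GitHub "latest release" endpoint; the 24h interval, the example repository, and the lack of authentication are assumptions (a real poller would use a token to avoid the rate limits discussed below):

```js
// Sketch: poll the GitHub releases API for the latest release tag of a repo.
var https = require('https');

function latestGitHubRelease(owner, repo, callback) {
  var options = {
    hostname: 'api.github.com',
    path: '/repos/' + owner + '/' + repo + '/releases/latest',
    headers: { 'User-Agent': 'autopublish-poller' } // GitHub requires a User-Agent
  };
  https.get(options, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      try { callback(null, JSON.parse(body).tag_name); }
      catch (err) { callback(err); }
    });
  }).on('error', callback);
}

// Poll once a day (the interval is an open question in the thread).
setInterval(function () {
  latestGitHubRelease('twbs', 'bootstrap', function (err, tag) {
    if (!err) console.log('latest bootstrap release:', tag);
  });
}, 24 * 60 * 60 * 1000);
```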
I'm not sure it's a good idea to wrap the whole of bower? (rather have people request it - could be automated, e.g. have a "request bower package" page) The GitHub API has call limits? It's less work to diff the bower json/list? (I have nothing to back this up, just thoughts)
But yeah, we might still want to have some options for adding stuff manually, like tests etc., if we're not able to convert.
@raix, yes GitHub has some rate limits, but I guess it's not going to be a problem (at least for now, and probably not later on either...). Having to poll/diff something (every 4 hours?), which source do you guys think would cover the larger number of packages? I'd say looking at new releases on GitHub would be enough, and we'd cover both libraries published on npm/bower and ones not published anywhere else. The other way around: how many packages that can be found on npm/bower are not available as GitHub repositories? I really have no idea about this... please shed some light if you have a feeling about it!
@raix's proposal to have a page for package requests is a good one!
✅ poll release number (simply one more entry to be added to the db and considered on the timeout interval...)
I think we're over-designing and under-building. The right answers to our questions can only come from usage. I mean, 4 hours? If we updated once a week people would be happy; the libraries are currently lagging months behind the latest version.

Why don't we just try the minimum viable solution, just pulling from the npm package.json and without negotiating with library authors for a start? If it doesn't work, there's always room to change the approach later. @splendido @raix Correct me if I'm wrong, but I hope packages won't be added automatically, just updated, or we'll get a flood of low-quality crap. Like I said, the problem is social rather than technical: by leveraging the people who need the up-to-date library vs. those who should provide it, we'll have a solution that scales.

Regarding the organization, I believe we should publish all automatic packages under one organization; organizations such as d3js or nvd3 don't make much sense unless the author is behind them. Plus we'll have the added brand benefit of up-to-date packages.
Yep, we should make adding packages a manual process (could be from npm/bower/github) - to ensure only quality packages.
The system could do a simple check of the exports to see if everything is working correctly - maybe with a way to report back to the system if it fails. Ideally, imho, we should have a review team - paid - and a high-quality version of Atmosphere with optional paid packages, to ensure quality and development. (If not, we will end up with a jungle of packages.)
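The "simple check for exports" could be as small as a Tinytest that asserts the expected global is defined on the architecture where the library is supposed to run; this is only a sketch, and the export name, package name, and test file name are placeholders:

```js
// In the wrapper's package.js, the test file would be added only where the
// library runs, e.g.:
//   Package.onTest(function (api) {
//     api.use(['tinytest', 'twbs:bootstrap']);
//     api.addFiles('exports-tests.js', 'client');
//   });

// exports-tests.js — the whole "environment test":
Tinytest.add('exports - upstream global is defined', function (test) {
  // UpstreamGlobal is a placeholder for whatever the library attaches to window.
  test.notEqual(typeof UpstreamGlobal, 'undefined',
    'expected the wrapped library to export a global');
});
```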
It seems that @splendido and I have been favoring different types of integrations: as light as possible, in the hope that the original authors accept the PR; vs. with tests, favoring a solid solution.
That's fine, but now we have different packaging guidelines at https://github.com/MeteorPackaging/autopublish.meteor.com/wiki#step-by-step-guide vs. MeteorCommunity/discussions#14 -> Get the integration working
@splendido, you agreed we should use exports, so maybe we can just copy a chunk of the issue 14/OT guidelines into the Autopublish wiki. I've made some edits there but for now, I've left your light integration steps in place.
Also, just saw your comment on nvd3.