diff --git a/docs/LICENSE.txt b/docs/LICENSE.txt
new file mode 100644
index 0000000..bd6cece
--- /dev/null
+++ b/docs/LICENSE.txt
@@ -0,0 +1,31 @@
+Copyright 2011 Disney Enterprises, Inc. All rights reserved
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are
+met:
+
+* Redistributions of source code must retain the above copyright
+notice, this list of conditions and the following disclaimer.
+
+* Redistributions in binary form must reproduce the above copyright
+notice, this list of conditions and the following disclaimer in
+the documentation and/or other materials provided with the
+distribution.
+
+* The names "Disney", "Walt Disney Pictures", "Walt Disney Animation
+Studios" or the names of its contributors may NOT be used to
+endorse or promote products derived from this software without
+specific prior written permission from Walt Disney Pictures.
+
+Disclaimer: THIS SOFTWARE IS PROVIDED BY WALT DISNEY PICTURES AND
+CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING,
+BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS
+FOR A PARTICULAR PURPOSE, NONINFRINGEMENT AND TITLE ARE DISCLAIMED.
+IN NO EVENT SHALL WALT DISNEY PICTURES, THE COPYRIGHT HOLDER OR
+CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND BASED ON ANY
+THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
diff --git a/docs/about.txt b/docs/about.txt
new file mode 100644
index 0000000..eb74904
--- /dev/null
+++ b/docs/about.txt
@@ -0,0 +1,13 @@
+Reposado is a set of tools written in Python that replicate some of the functionality of Apple's Software Update Service, available as part of Mac OS X Server.
+
+Reposado, together with the "curl" binary tool and a web server such as Apache 2, enables you to host a local Apple Software Update Server on the hardware and OS of your choice.
+
+Reposado contains a tool (repo_sync) to download Software Update catalogs and (optionally) update packages from Apple's servers, enabling you to host them from a local web server.
+
+Additionally, Reposado provides a command-line tool (repoutil) that enables you to create any arbitrary number of "branches" of the Apple catalogs. These branches can contain any subset of the available updates. For example, one could create "testing" and "release" branches, and then set some clients to use the "testing" branch catalog to test newly-released updates. You would set most of your clients to use the "release" branch catalog, which would contain updates that had been through the testing process.
+
+If you configure Reposado to also download the actual updates as well as the catalogs, you can continue to offer updates that have been superseded by more recent updates. For example, if you are currently offering the 10.6.7 updates to your clients, and Apple releases a 10.6.8 update, you can continue to offer the (deprecated) 10.6.7 update until you are ready to release the newer update to your clients. You can even offer the 10.6.7 update to your "release" clients while offering the 10.6.8 update to your "testing" clients. Offering "deprecated" Apple Software Updates is a feature that is difficult with Apple's tools.
+
+Apple's Software Update Service does a few things. Primarily, it replicates software updates from Apple's servers, downloading them to a local machine. Secondly, it functions as a web server to actually serve these updates to client machines. Reposado does not duplicate the web server portion of Apple's Software Update Service. Instead you may use any existing web server you wish.
+
+Reposado also currently relies on the command-line "curl" binary to download updates from Apple's servers. curl is available on OS X, RedHat Linux, and many other OSes, including Win32 and Win64 versions. See http://curl.haxx.se for more information.
diff --git a/docs/getting_started.txt b/docs/getting_started.txt
new file mode 100644
index 0000000..0df044c
--- /dev/null
+++ b/docs/getting_started.txt
@@ -0,0 +1,31 @@
+Getting Started with Reposado
+
+What you need:
+
+- The Reposado tools
+- Python 2.5-2.7 (Reposado has been tested with Python 2.6, but should work with any version from 2.5 through 2.7.)
+- curl binary
+- A web server
+- Storage space for the catalogs and update packages. If you are replicating the update packages, you'll need approximately 40GB of space as of April 2011. The need for space grows as additional updates are released by Apple. If you are only replicating catalogs, you'll probably need less than 100MB of space, though the exact amount of space needed depends on the number of branch catalogs you create.
+
+1) Download the Reposado tools.
+
+2) Create a directory in which to store replicated catalogs and updates, and another to store Reposado metadata. These may share a parent directory, like so:
+ /Volumes/data/reposado/html
+ /Volumes/data/reposado/metadata
+Make sure you have enough space for the replicated catalogs and updates.
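+
+For example, on OS X or Linux:
+
+mkdir -p /Volumes/data/reposado/html
+mkdir -p /Volumes/data/reposado/metadata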
+
+3) Configure your web server to serve the contents of the updates root directory you created ("/Volumes/data/reposado/html" in the example above). If you are planning to replicate and serve the actual update packages as well as the catalogs, make note of the URL needed to access the updates root directory via HTTP. This will be the LocalCatalogURLBase when configuring Reposado in the next step.
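+
+As a minimal sketch (the hostname and paths are the examples used throughout these docs; adapt them to your own web server), an Apache 2 virtual host serving the updates root might look like:
+
+<VirtualHost *:80>
+    ServerName su.myorg.com
+    DocumentRoot /Volumes/data/reposado/html
+</VirtualHost>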
+
+4) Configure Reposado by creating a preferences.plist in the same directory as the repoutil script. See "reposado_preferences.txt" for details on this file.
+
+5) Run repo_sync to replicate update catalogs and (optionally) update packages to the UpdatesRootDir directory. The first time you do this it may take several hours to complete if you are replicating packages as well as catalogs.
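+
+For example, from the directory containing the Reposado tools:
+
+./repo_sync
+
+Adding the --recheck option tells repo_sync to recheck already-downloaded packages for changes.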
+
+6) Test things so far by visiting a catalog URL in your browser. If http://su.myorg.com is the URL for the updates root directory you created earlier, then http://su.myorg.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog
+is the CatalogURL for the Snow Leopard updates catalog. You should see a plist version of the updates catalog displayed in your browser.
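+
+You can also check from the command line with curl (using the example hostname above):
+
+curl -I "http://su.myorg.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog"
+
+An HTTP 200 response indicates the catalog is being served.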
+
+7) Next test: run softwareupdate on a client, again pointing it to your Catalog URL:
+
+softwareupdate -l --CatalogURL "http://su.myorg.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog"
+
+If there are no errors, you've successfully configured Reposado and successfully replicated Apple Software Updates.
diff --git a/docs/reference.txt b/docs/reference.txt
new file mode 100644
index 0000000..9477cc2
--- /dev/null
+++ b/docs/reference.txt
@@ -0,0 +1,107 @@
+This is a reference to the command-line options for the repoutil tool.
+
+repoutil --products [--sort <sort_order>] [--reverse]
+repoutil --updates [--sort <sort_order>] [--reverse]
+
+List available updates. You may optionally sort by date, title, or id, and optionally reverse the sort order. The default sort order is by post date, so that newest updates are listed last.
+Example:
+
+repoutil --products
+061-1688 Final Cut Express 2.0.3 2005-04-12 []
+061-1704 iMovie HD Update 5.0.2 2005-04-14 []
+061-1702 iDVD Update 5.0.1 2005-04-14 []
+[...]
+zzz041-0453 Security Update 2011-002 1.0 2011-04-14 []
+zzz041-0654 Security Update 2011-002 1.0 2011-04-14 []
+zzz041-0656 Security Update 2011-002 1.0 2011-04-14 []
+041-0560 Safari 5.0.5 2011-04-14 ['testing']
+zzzz041-0565 Safari 5.0.5 2011-04-14 ['testing']
+zzzz041-0531 iTunes 10.2.2 2011-04-18 ['testing']
+zzzz041-0532 iTunes 10.2.2 2011-04-18 ['testing']
+
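+You can change the sort order; for example, to list products sorted by title in reverse order:
+
+repoutil --products --sort title --reverse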
+
+repoutil --deprecated
+
+List deprecated updates. These are updates that have been superseded by newer versions. Example:
+
+
+repoutil --branches
+
+List available branch catalogs. Example:
+
+repoutil --branches
+release
+testing
+
+repoutil --new-branch BRANCH_NAME
+
+Creates a new empty branch named BRANCH_NAME. Example:
+
+repoutil --new-branch testing
+
+repoutil --delete-branch BRANCH_NAME
+
+Deletes the branch named BRANCH_NAME.
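+Example:
+
+repoutil --delete-branch testing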
+
+repoutil --copy-branch SOURCE_BRANCH DEST_BRANCH
+
+Copies all items from SOURCE_BRANCH to DEST_BRANCH, completely replacing the contents of DEST_BRANCH with the contents of SOURCE_BRANCH. If DEST_BRANCH does not exist, it will be created.
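+Example:
+
+repoutil --copy-branch testing release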
+
+repoutil --list-branch BRANCH_NAME [--sort <sort_order>] [--reverse]
+repoutil --list-catalog BRANCH_NAME [--sort <sort_order>] [--reverse]
+
+List updates in branch BRANCH_NAME. You may optionally sort these.
+
+repoutil --list-branch testing
+zzzz041-0565 Safari 5.0.5 2011-04-14 ['testing']
+041-0560 Safari 5.0.5 2011-04-14 ['testing']
+zzzz041-0532 iTunes 10.2.2 2011-04-18 ['testing']
+zzzz041-0531 iTunes 10.2.2 2011-04-18 ['testing']
+
+repoutil --product-info PRODUCT_ID
+repoutil --info PRODUCT_ID
+
+Prints detailed info on a specific update. Example:
+
+repoutil --info 041-0560
+Product: 041-0560
+Title: Safari
+Version: 5.0.5
+Size: 47.1 MB
+Post Date: 2011-04-14 17:13:16
+AppleCatalogs:
+ http://swscan.apple.com/content/catalogs/index.sucatalog
+ http://swscan.apple.com/content/catalogs/others/index-leopard.merged-1.sucatalog
+ http://swscan.apple.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog
+Branches:
+ testing
+HTML Description:
+
+ This update is recommended for all Safari users and includes the latest security updates.
+
+ For information on the security content of this update, please visit this website: http://support.apple.com/kb/HT1222.
+
+repoutil --add-product PRODUCT_ID [PRODUCT_ID ...] BRANCH_NAME
+repoutil --add-update=PRODUCT_ID [PRODUCT_ID ...] BRANCH_NAME
+repoutil --add=PRODUCT_ID [PRODUCT_ID ...] BRANCH_NAME
+
+Add one or more PRODUCT_IDs to catalog branch BRANCH_NAME.
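+Example:
+
+repoutil --add-product 041-0560 zzzz041-0565 testing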
+
+repoutil --remove-product=PRODUCT_ID [PRODUCT_ID ...] BRANCH_NAME
+
+Remove one or more PRODUCT_IDs from catalog branch BRANCH_NAME.
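+Example:
+
+repoutil --remove-product zzzz041-0531 zzzz041-0532 testing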
diff --git a/docs/reposado_operation.txt b/docs/reposado_operation.txt
new file mode 100644
index 0000000..fdcf41e
--- /dev/null
+++ b/docs/reposado_operation.txt
@@ -0,0 +1,104 @@
+A basic guide to Reposado operation.
+
+See "getting_started.txt" for initial setup, configuration, and testing.
+
+Once you have successfully set up and configured Reposado, you have a local mirror of Apple's Software Update servers. Your clients can use the locally mirrored catalogs (and optionally, update packages) for Apple updates.
+
+You'll almost certainly want to run the repo_sync tool periodically to download catalog and package updates. The exact mechanism varies from platform to platform. On Linux or another flavor of Unix, you'd typically add a cron job; on OS X or OS X Server you could use a cron job, a periodic task, or a launchd job. You may run repo_sync as frequently or infrequently as you wish, but Apple's tools sync with Apple once a day, so you might consider that as a guide.
+
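+For example, a crontab entry like the following (the install path is hypothetical; point it at your own copy of repo_sync) would sync once a day at 1 AM:
+
+0 1 * * * /usr/local/reposado/repo_sync
+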
+If all you want is a local replica of Apple's Software Updates in order to conserve your organization's Internet bandwidth, you need not do anything beyond the initial configuration and setting up periodic execution of repo_sync. If you'd like to manage which updates are made available to your client machines, the repoutil tool provides the means to do so.
+
+CatalogURLs
+
+By default, Reposado replicates three update catalogs from Apple's servers:
+
+http://swscan.apple.com/content/catalogs/index.sucatalog
+http://swscan.apple.com/content/catalogs/others/index-leopard.merged-1.sucatalog
+http://swscan.apple.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog
+
+These are the update catalogs for Tiger, Leopard, and Snow Leopard clients, respectively. You may point any or all of your client machines to the replicated versions of these catalogs. The clients will get the latest updates from Apple as soon as they've been replicated to your Reposado server.
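+
+One common way to point an OS X client at one of these replicated catalogs is to set the CatalogURL preference, for example (using the example hostname from these docs):
+
+sudo defaults write /Library/Preferences/com.apple.SoftwareUpdate CatalogURL "http://su.myorg.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog"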
+
+If you'd like to control the availability of updates, you must create "branch" catalogs, which contain subsets of the available updates. You then point some or all of your clients to these branch catalogs instead of the "raw" catalogs that come from Apple.
+
+Creating branch catalogs
+
+Start by creating an empty branch:
+
+repoutil --new-branch testing
+
+This creates a new, empty branch named "testing". This name is appended to the "raw" Apple catalog names, so that the CatalogURLs become something like:
+
+http://su.myorg.com/content/catalogs/index_testing.sucatalog
+http://su.myorg.com/content/catalogs/others/index-leopard.merged-1_testing.sucatalog
+http://su.myorg.com/content/catalogs/others/index-leopard-snowleopard.merged-1_testing.sucatalog
+
+...but not until you've added something to the testing branch.
+
+Adding products to a branch catalog
+
+Get a list of products:
+
+repoutil --products
+...long list omitted for brevity...
+041-0560 Safari 5.0.5 2011-04-14 []
+zzzz041-0565 Safari 5.0.5 2011-04-14 []
+zzzz041-0531 iTunes 10.2.2 2011-04-18 []
+zzzz041-0532 iTunes 10.2.2 2011-04-18 []
+
+Add both Safaris and iTunes to testing:
+
+repoutil --add-product 041-0560 zzzz041-0565 zzzz041-0531 zzzz041-0532 testing
+Adding 041-0560 (Safari-5.0.5) to branch testing...
+Adding zzzz041-0565 (Safari-5.0.5) to branch testing...
+Adding zzzz041-0531 (iTunes-10.2.2) to branch testing...
+Adding zzzz041-0532 (iTunes-10.2.2) to branch testing...
+
+The testing catalogs are now available at URLs similar to those listed above, and they offer only Safari and iTunes.
+
+Removing products from a branch catalog
+
+You can remove products from branch catalogs:
+
+repoutil --remove-product zzzz041-0531 zzzz041-0532 testing
+Removing zzzz041-0531 (iTunes-10.2.2) from branch testing...
+Removing zzzz041-0532 (iTunes-10.2.2) from branch testing...
+
+This removes both iTunes 10.2.2 products from the testing branch.
+
+You can list the contents of branch catalogs:
+
+repoutil --list-branch testing
+041-0560 Safari 5.0.5 2011-04-14 ['testing']
+zzzz041-0565 Safari 5.0.5 2011-04-14 ['testing']
+
+and copy one branch to another:
+
+repoutil --copy-branch testing release
+Really replace contents of branch release with branch testing? [y/n] y
+Copied contents of branch testing to branch release.
+
+One possible branch catalog workflow
+
+A very small number of your machines are configured to use the "raw" catalogs from Apple. As new updates are released, after a short delay (a day or so?), you add them to the "testing" branch. Your "testing" group of machines is configured to use the "testing" CatalogURLs for their updates. After enough testing to confirm there are no issues, you add the new updates to the "release" branch. Most machines in your organization are configured to use the "release" CatalogURLs.
+
+In this way, new updates are tested before being released to the majority of your machines.
+
+Another use for branch catalogs: if you had a set of machines that must remain on a specific OS release for compatibility with a specific application, you could create one or more special branch catalogs that contained no Mac OS X updates, but only updates to Safari, iTunes, the iLife and iWork apps. In this way you could update the other applications while leaving the OS itself at a specific version.
+
+Deprecated products
+
+Items from Apple's Software Update service can become "deprecated". This commonly occurs when a newer version of an update is made available. For example, when Apple releases a new update to iTunes, all older iTunes updates are deprecated and no longer available from Apple's Software Update servers. Similarly, new updates for Mac OS X cause older updates to be deprecated.
+
+This behavior can sometimes present problems for system administrators. Let's say you had made the 10.6.6 update available to all the machines you manage, and some had updated while others had not. Apple then releases the Mac OS X 10.6.7 update, which causes the 10.6.6 update to disappear from Apple's update servers. If you are not ready to move to 10.6.7 (because you need testing time), but some of your machines are still running 10.6.5 or earlier, Apple's Software Update Service gives you no way to update those machines to 10.6.6.
+
+However, Reposado caches all updates it downloads from Apple and does not automatically remove deprecated updates. This enables you to continue to offer deprecated updates in a branch catalog until you are ready to replace the deprecated update with the new version.
+
+This feature also carries a responsibility: it is up to the admin to remove deprecated products from branch catalogs when adding their updated versions to the same branch. Deprecated products are tagged as such in product listings:
+
+repoutil --list-branch testing
+zzzz061-9636 iTunes 10.2 2011-03-02 ['release', 'testing'] (Deprecated)
+zzzz041-0306 iTunes 10.2 2011-03-02 ['release', 'testing'] (Deprecated)
+
+When adding a newer version of iTunes to the testing or release branches, you'd want to be certain to remove these older, deprecated versions.
+
+
diff --git a/docs/reposado_preferences.txt b/docs/reposado_preferences.txt
new file mode 100644
index 0000000..03a16eb
--- /dev/null
+++ b/docs/reposado_preferences.txt
@@ -0,0 +1,46 @@
+Reposado's configuration is defined in a plist file named "preferences.plist" located in the same directory as the repoutil script.
+
+Two key/values are required:
+
+UpdatesRootDir: a string providing a path where the catalogs and update packages should be stored. Example: /Volumes/data/reposado/html
+
+UpdatesMetadataDir: a string providing a path where metadata used by Reposado should be stored. Example: /Volumes/data/reposado/metadata
+
+If you are replicating the updates as well as the catalogs, you must also include:
+
+LocalCatalogURLBase
+ This is the "root" URL for your local Software Update repo. Reposado will re-write all product URLs in the update catalogs to use this root URL. For example, a LocalCatalogURLBase of "http://su.myorg.com" will result in Snow Leopard update catalog URLs like:
+
+http://su.myorg.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog
+
+If LocalCatalogURLBase is undefined, only the Apple catalogs (and not the update packages) will be replicated, and the URLs within them will not be re-written. This allows you to have custom catalogs for Apple Software Update, but clients will still download the actual update packages from Apple's servers. If Reposado is configured this way, you will not be able to offer deprecated updates to clients.
+
+Optional keys:
+
+AppleCatalogURLs
+ This is an array of strings that specify the Apple SUS catalog URLs to replicate. If this is undefined, it defaults to:
+
+['http://swscan.apple.com/content/catalogs/index.sucatalog',
+ 'http://swscan.apple.com/content/catalogs/others/index-leopard.merged-1.sucatalog',
+ 'http://swscan.apple.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog']
+
+PreferredLocalizations
+ A list of preferred language localizations for software update descriptions. Defaults to ['English', 'en'].
+
+CurlPath
+ Path to the curl binary tool. Defaults to '/usr/bin/curl'
+
+Example preferences.plist:
+
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
+<plist version="1.0">
+<dict>
+    <key>LocalCatalogURLBase</key>
+    <string>http://su.myorg.com</string>
+    <key>UpdatesRootDir</key>
+    <string>/Volumes/data/reposado/html</string>
+    <key>UpdatesMetadataDir</key>
+    <string>/Volumes/data/reposado/metadata</string>
+</dict>
+</plist>
\ No newline at end of file
diff --git a/other/reposado.jpg b/other/reposado.jpg
new file mode 100644
index 0000000..f06bba1
Binary files /dev/null and b/other/reposado.jpg differ
diff --git a/repo_sync b/repo_sync
new file mode 100755
index 0000000..3d7f4b2
--- /dev/null
+++ b/repo_sync
@@ -0,0 +1,621 @@
+#!/usr/bin/env python
+# encoding: utf-8
+#
+# Copyright 2011 Disney Enterprises, Inc. All rights reserved
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+
+# * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+
+# * Redistributions in binary form must reproduce the above copyright
+# notice, this list of conditions and the following disclaimer in
+# the documentation and/or other materials provided with the
+# distribution.
+
+# * The names "Disney", "Walt Disney Pictures", "Walt Disney Animation
+# Studios" or the names of its contributors may NOT be used to
+# endorse or promote products derived from this software without
+# specific prior written permission from Walt Disney Pictures.
+
+# Disclaimer: THIS SOFTWARE IS PROVIDED BY WALT DISNEY PICTURES AND
+# CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING,
+# BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS
+# FOR A PARTICULAR PURPOSE, NONINFRINGEMENT AND TITLE ARE DISCLAIMED.
+# IN NO EVENT SHALL WALT DISNEY PICTURES, THE COPYRIGHT HOLDER OR
+# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND BASED ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+
+"""
+repo_sync
+
+Created by Greg Neagle on 2011-03-03.
+"""
+
+import calendar
+import os
+import optparse
+import plistlib
+import subprocess
+#import sys
+import time
+import tempfile
+import urlparse
+from xml.dom import minidom
+#from xml.parsers.expat import ExpatError
+
+from reposadolib import reposadocommon
+
+
+def parseServerMetadata(filename):
+ '''Parses a softwareupdate server metadata file, looking for information
+ of interest.
+ Returns a dictionary containing title, version, and description.'''
+ title = ''
+ vers = ''
+ description = ''
+ md_plist = plistlib.readPlist(filename)
+ vers = md_plist.get('CFBundleShortVersionString','')
+ localization = md_plist.get('localization', {})
+ languages = localization.keys()
+ preferred_lang = getPreferredLocalization(languages)
+ preferred_localization = localization.get(preferred_lang)
+ if preferred_localization:
+ title = preferred_localization.get('title','')
+ encoded_description = preferred_localization.get('description','')
+ if encoded_description:
+ description = str(encoded_description)
+
+ metadata = {}
+ metadata['title'] = title
+ metadata['version'] = vers
+ metadata['description'] = description
+ return metadata
+
+
+def parseSUdist(filename):
+    '''Parses a softwareupdate dist file, looking for information of interest.
+ Returns a dictionary containing title, version, and description.'''
+ su_name = ""
+ title = ""
+
+ dom = minidom.parse(filename)
+
+ choice_elements = dom.getElementsByTagName("choice")
+ if choice_elements:
+ for choice in choice_elements:
+ keys = choice.attributes.keys()
+ if 'suDisabledGroupID' in keys:
+ # this is the name as displayed from
+ # /usr/sbin/softwareupdate -l
+ su_name = choice.attributes[
+ 'suDisabledGroupID'].value
+
+ text = ""
+ localizations = dom.getElementsByTagName('localization')
+ if localizations:
+ string_elements = localizations[0].getElementsByTagName('strings')
+ if string_elements:
+ strings = string_elements[0]
+ if strings.firstChild:
+ text = strings.firstChild.wholeText
+
+ # get title, version and description as displayed in Software Update
+ title = vers = description = ""
+ keep = False
+ for line in text.split('\n'):
+ if line.startswith('"SU_TITLE"'):
+ title = line[10:]
+ title = title[title.find('"')+1:-2]
+ if line.startswith('"SU_VERS"'):
+ vers = line[9:]
+ vers = vers[vers.find('"')+1:-2]
+ if line.startswith('"SU_VERSION"'):
+ vers = line[12:]
+ vers = vers[vers.find('"')+1:-2]
+ if line.startswith('"SU_DESCRIPTION"'):
+ description = ""
+ keep = True
+ # lop off "SU_DESCRIPTION"
+ line = line[16:]
+ # lop off everything up through '
+ line = line[line.find("'")+1:]
+
+ if keep:
+ # replace escaped single quotes
+ line = line.replace("\\'","'")
+ if line == "';":
+ # we're done
+ break
+ elif line.endswith("';"):
+ # done
+ description += line[0:-2]
+ break
+ else:
+ # append the line to the description
+ description += line + "\n"
+
+ dist = {}
+ dist['su_name'] = su_name
+ dist['title'] = title
+ dist['version'] = vers
+ dist['description'] = description
+ return dist
+
+
+class CurlError(Exception):
+ '''curl returned an error we can't handle'''
+ pass
+
+
+class HTTPError(Exception):
+ '''curl returned an HTTP error we can't handle'''
+ pass
+
+
+class CurlDownloadError(Exception):
+ """Curl failed to download the item"""
+ pass
+
+
+def curl(url, destinationpath, onlyifnewer=False, etag=None, resume=False):
+ """Gets an HTTP or HTTPS URL and stores it in
+ destination path. Returns a dictionary of headers, which includes
+ http_result_code and http_result_description.
+ Will raise CurlError if curl returns an error.
+ Will raise HTTPError if HTTP Result code is not 2xx or 304.
+ If destinationpath already exists, you can set 'onlyifnewer' to true to
+ indicate you only want to download the file only if it's newer on the
+ server.
+ If you have an ETag from the current destination path, you can pass that
+ to download the file only if it is different.
+ Finally, if you set resume to True, curl will attempt to resume an
+ interrupted download. You'll get an error if the existing file is
+ complete; if the file has changed since the first download attempt, you'll
+ get a mess."""
+
+ header = {}
+ header['http_result_code'] = '000'
+ header['http_result_description'] = ""
+
+ curldirectivepath = os.path.join(TMPDIR, 'curl_temp')
+ tempdownloadpath = destinationpath + '.download'
+
+ # we're writing all the curl options to a file and passing that to
+ # curl so we avoid the problem of URLs showing up in a process listing
+ try:
+ fileobj = open(curldirectivepath, mode='w')
+ print >> fileobj, 'silent' # no progress meter
+ print >> fileobj, 'show-error' # print error msg to stderr
+ print >> fileobj, 'no-buffer' # don't buffer output
+ print >> fileobj, 'fail' # throw error if download fails
+ print >> fileobj, 'dump-header -' # dump headers to stdout
+ print >> fileobj, 'speed-time = 30' # give up if too slow d/l
+ print >> fileobj, 'output = "%s"' % tempdownloadpath
+ print >> fileobj, 'ciphers = HIGH,!ADH' # use only >=128 bit SSL
+ print >> fileobj, 'url = "%s"' % url
+
+ if os.path.exists(tempdownloadpath):
+ if resume:
+ # let's try to resume this download
+ print >> fileobj, 'continue-at -'
+ else:
+ os.remove(tempdownloadpath)
+
+ if os.path.exists(destinationpath):
+ if etag:
+ escaped_etag = etag.replace('"','\\"')
+ print >> fileobj, ('header = "If-None-Match: %s"'
+ % escaped_etag)
+ elif onlyifnewer:
+ print >> fileobj, 'time-cond = "%s"' % destinationpath
+ else:
+ os.remove(destinationpath)
+
+ fileobj.close()
+ except Exception, e:
+ raise CurlError(-5, 'Error writing curl directive: %s' % str(e))
+
+ cmd = [reposadocommon.pref('CurlPath'),
+ '-q', # don't read .curlrc file
+ '--config', # use config file
+ curldirectivepath]
+
+ proc = subprocess.Popen(cmd, shell=False, bufsize=1,
+ stdin=subprocess.PIPE,
+ stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+
+ targetsize = 0
+ downloadedpercent = -1
+ donewithheaders = False
+ printed_message = False
+
+ while True:
+ if not donewithheaders:
+ info = proc.stdout.readline().strip('\r\n')
+ if info:
+ if info.startswith('HTTP/'):
+ header['http_result_code'] = info.split(None, 2)[1]
+ header['http_result_description'] = info.split(None, 2)[2]
+ elif ': ' in info:
+ part = info.split(None, 1)
+ fieldname = part[0].rstrip(':').lower()
+ header[fieldname] = part[1]
+ else:
+ # we got an empty line; end of headers (or curl exited)
+ donewithheaders = True
+ try:
+ targetsize = int(header.get('content-length'))
+ except (ValueError, TypeError):
+ targetsize = 0
+ if header.get('http_result_code') == '206':
+ # partial content because we're resuming
+ reposadocommon.print_stderr(
+ 'Resuming partial download for %s',
+ os.path.basename(destinationpath))
+ contentrange = header.get('content-range')
+ if contentrange.startswith('bytes'):
+ try:
+ targetsize = int(contentrange.split('/')[1])
+ except (ValueError, TypeError):
+ targetsize = 0
+
+ elif targetsize and header.get('http_result_code').startswith('2'):
+ if not printed_message:
+ printed_message = True
+ reposadocommon.print_stdout('Downloading %s bytes from %s...',
+ targetsize, url)
+ time.sleep(0.1)
+
+ if (proc.poll() != None):
+ break
+
+ retcode = proc.poll()
+ if retcode:
+ curlerr = proc.stderr.read().rstrip('\n').split(None, 2)[2]
+ if os.path.exists(tempdownloadpath):
+ if (not resume) or (retcode == 33):
+ # 33 means server doesn't support range requests
+ # and so cannot resume downloads, so
+ os.remove(tempdownloadpath)
+ raise CurlError(retcode, curlerr)
+ else:
+ temp_download_exists = os.path.isfile(tempdownloadpath)
+ http_result = header.get('http_result_code')
+ if downloadedpercent != 100 and \
+ http_result.startswith('2') and \
+ temp_download_exists:
+ downloadedsize = os.path.getsize(tempdownloadpath)
+ if downloadedsize >= targetsize:
+ os.rename(tempdownloadpath, destinationpath)
+ return header
+ else:
+ # not enough bytes retreived
+ if not resume and temp_download_exists:
+ os.remove(tempdownloadpath)
+ raise CurlError(-5, 'Expected %s bytes, got: %s' %
+ (targetsize, downloadedsize))
+ elif http_result.startswith('2') and temp_download_exists:
+ os.rename(tempdownloadpath, destinationpath)
+ return header
+ elif http_result == '304':
+ return header
+ else:
+ # there was a download error of some sort; clean all relevant
+ # downloads that may be in a bad state.
+ for f in [tempdownloadpath, destinationpath]:
+ try:
+ os.unlink(f)
+ except OSError:
+ pass
+ raise HTTPError(http_result,
+ header.get('http_result_description',''))
+
+
+def getURL(url, destination_path):
+ '''Downloads a file from url to destination_path, checking existing
+    files by mod date or etag'''
+ if os.path.exists(destination_path):
+ saved_etag = get_saved_etag(url)
+ else:
+ saved_etag = None
+ try:
+ header = curl(url, destination_path,
+ onlyifnewer=True, etag=saved_etag)
+ except CurlError, err:
+ err = 'Error %s: %s' % tuple(err)
+ raise CurlDownloadError(err)
+
+ except HTTPError, err:
+ err = 'HTTP result %s: %s' % tuple(err)
+ raise CurlDownloadError(err)
+
+ err = None
+ if header['http_result_code'] == '304':
+ # not modified; what we have is correct
+ #print >> sys.stderr, ('%s is already downloaded.' % url)
+ pass
+ else:
+ if header.get('last-modified'):
+ # set the modtime of the downloaded file to the modtime of the
+ # file on the server
+ modtimestr = header['last-modified']
+ modtimetuple = time.strptime(modtimestr,
+ '%a, %d %b %Y %H:%M:%S %Z')
+ modtimeint = calendar.timegm(modtimetuple)
+ os.utime(destination_path, (time.time(), modtimeint))
+ if header.get('etag'):
+ # store etag for future use
+ record_etag(url, header['etag'])
+
+
+_ETAG = {}
+def get_saved_etag(url):
+ '''Retrieves a saved etag'''
+ global _ETAG
+ if _ETAG == {}:
+        _ETAG = reposadocommon.getDataFromPlist('ETags.plist')
+ if url in _ETAG:
+ return _ETAG[url]
+ else:
+ return None
+
+
+def record_etag(url, etag):
+ '''Saves an etag in our internal dict'''
+ global _ETAG
+ _ETAG[url] = etag
+
+
+def writeEtagDict():
+ '''Writes our stored etags to disk'''
+ reposadocommon.writeDataToPlist(_ETAG, 'ETags.plist')
+
+
+class ReplicationError (Exception):
+ '''A custom error when replication fails'''
+ pass
+
+
+def replicateURLtoFilesystem(full_url, root_dir=None,
+ base_url=None, copy_only_if_missing=False,
+ appendToFilename=''):
+ '''Downloads a URL and stores it in the same relative path on our
+ filesystem. Returns a path to the replicated file.'''
+
+ if root_dir == None:
+ root_dir = reposadocommon.pref('UpdatesRootDir')
+
+ if base_url:
+ if not full_url.startswith(base_url):
+ raise ReplicationError('%s is not a resource in %s' %
+ (full_url, base_url))
+ relative_url = full_url[len(base_url):].lstrip('/')
+ else:
+ (unused_scheme, unused_netloc,
+ path, unused_query, unused_fragment) = urlparse.urlsplit(full_url)
+ relative_url = path.lstrip('/')
+
+ local_file_path = os.path.join(root_dir, relative_url) + appendToFilename
+ local_dir_path = os.path.dirname(local_file_path)
+ if copy_only_if_missing and os.path.exists(local_file_path):
+ return local_file_path
+ if not os.path.exists(local_dir_path):
+ try:
+ os.makedirs(local_dir_path)
+ except OSError, oserr:
+ raise ReplicationError(oserr)
+ try:
+ getURL(full_url, local_file_path)
+ except CurlDownloadError, e:
+ raise ReplicationError(e)
+ return local_file_path
+
+
+class ArchiveError (Exception):
+ '''A custom error when archiving fails'''
+ pass
+
+
+def archiveCatalog(catalogpath):
+ '''Makes a copy of our catalog in our archive folder,
+ marking with a date'''
+ archivedir = os.path.join(os.path.dirname(catalogpath), 'archive')
+ if not os.path.exists(archivedir):
+ try:
+ os.makedirs(archivedir)
+ except OSError, oserr:
+ raise ArchiveError(oserr)
+ # get modtime of original file
+ modtime = int(os.stat(catalogpath).st_mtime)
+ # make a string from the mod time
+ modtimestring = time.strftime('.%Y-%m-%d-%H%M%S', time.localtime(modtime))
+ catalogname = os.path.basename(catalogpath)
+ # remove the '.apple' from the end of the catalogname
+ if catalogname.endswith('.apple'):
+ catalogname = catalogname[0:-6]
+ archivepath = os.path.join(archivedir, catalogname + modtimestring)
+ if not os.path.exists(archivepath):
+ catalog = plistlib.readPlist(catalogpath)
+ plistlib.writePlist(catalog, archivepath)
+ # might as well set the mod time of the archive file to match
+ os.utime(archivepath, (time.time(), modtime))
+
+
+def getPreferredLocalization(list_of_localizations):
+ '''Picks the best localization from a list of available
+ localizations. If we're running on OS X, we use
+ NSBundle.preferredLocalizationsFromArray_forPreferences_,
+ else we look for PreferredLocalizations in our preferences'''
+ try:
+ from Foundation import NSBundle
+ except ImportError:
+ # Foundation NSBundle isn't available, use prefs instead
+ languages = (reposadocommon.pref('PreferredLocalizations')
+ or ['English', 'en'])
+ for language in languages:
+ if language in list_of_localizations:
+ return language
+ else:
+ preferred_langs = \
+ NSBundle.preferredLocalizationsFromArray_forPreferences_(
+ list_of_localizations, None)
+ if preferred_langs:
+ return preferred_langs[0]
+
+ if 'English' in list_of_localizations:
+ return 'English'
+ elif 'en' in list_of_localizations:
+ return 'en'
+ return None
+
+
+TMPDIR = None
+def sync(fast_scan=False, download_packages=True):
+ '''Syncs Apple's Software Updates with our local store.
+ Returns a dictionary of products.'''
+ global TMPDIR
+ TMPDIR = tempfile.mkdtemp()
+
+ catalog_urls = reposadocommon.pref('AppleCatalogURLs')
+ products = reposadocommon.getProductInfo()
+
+ # clear cached AppleCatalog listings
+ for item in products.keys():
+ products[item]['AppleCatalogs'] = []
+ replicated_products = []
+
+ for catalog_url in catalog_urls:
+ localcatalogpath = \
+ reposadocommon.getLocalPathNameFromURL(catalog_url) + '.apple'
+ if os.path.exists(localcatalogpath):
+ archiveCatalog(localcatalogpath)
+ localcatalogpath = replicateURLtoFilesystem(catalog_url,
+ appendToFilename='.apple')
+ catalog = plistlib.readPlist(localcatalogpath)
+ if 'Products' in catalog:
+ product_keys = list(catalog['Products'].keys())
+ reposadocommon.print_stdout('%s products found in %s',
+ len(product_keys), catalog_url)
+ product_keys.sort()
+ for product_key in product_keys:
+ if product_key in replicated_products:
+ products[product_key]['AppleCatalogs'].append(
+ catalog_url)
+ else:
+ if not product_key in products:
+ products[product_key] = {}
+ products[product_key]['AppleCatalogs'] = [catalog_url]
+ product = catalog['Products'][product_key]
+ products[product_key]['CatalogEntry'] = product
+ server_metadata = None
+ if download_packages and 'ServerMetadataURL' in product:
+ try:
+ unused_path = replicateURLtoFilesystem(
+ product['ServerMetadataURL'],
+ copy_only_if_missing=fast_scan)
+ except ReplicationError:
+ continue
+
+ if download_packages:
+ try:
+ for package in product.get('Packages', []):
+ # TO-DO: Check 'Size' attribute and make sure
+ # we have enough space on the target
+ # filesystem before attempting to download
+ if 'URL' in package:
+ unused_path = replicateURLtoFilesystem(
+ package['URL'],
+ copy_only_if_missing=fast_scan)
+ if 'MetadataURL' in package:
+ unused_path = replicateURLtoFilesystem(
+ package['MetadataURL'],
+ copy_only_if_missing=fast_scan)
+ except ReplicationError:
+ continue
+
+ # calculate total size
+ size = 0
+ for package in product.get('Packages', []):
+ size += package.get('Size', 0)
+
+ distributions = product['Distributions']
+ preferred_lang = getPreferredLocalization(
+ distributions.keys())
+ preferred_dist = None
+
+ try:
+ for dist_lang in distributions.keys():
+ dist_url = distributions[dist_lang]
+ if (download_packages or
+ dist_lang == preferred_lang):
+ dist_path = replicateURLtoFilesystem(
+ dist_url,
+ copy_only_if_missing=fast_scan)
+ if dist_lang == preferred_lang:
+ preferred_dist = dist_path
+ except ReplicationError:
+ continue
+
+ if preferred_dist:
+ dist = parseSUdist(preferred_dist)
+ products[product_key]['title'] = dist['title']
+ products[product_key]['version'] = dist['version']
+ products[product_key]['size'] = size
+ products[product_key]['description'] = \
+ dist['description']
+ products[product_key]['PostDate'] = \
+ product['PostDate']
+ else:
+ products[product_key]['PostDate'] = \
+ product['PostDate']
+
+ # if we got this far, we've replicated the product data
+ replicated_products.append(product_key)
+
+ # record original catalogs in case the product is
+ # deprecated in the future
+ if not 'OriginalAppleCatalogs' in products[product_key]:
+ products[product_key]['OriginalAppleCatalogs'] = \
+ products[product_key]['AppleCatalogs']
+
+ # record products we've successfully downloaded
+ reposadocommon.writeDownloadStatus(replicated_products)
+ # write our ETags to disk for future use
+ writeEtagDict()
+ # record our product cache
+ reposadocommon.writeProductInfo(products)
+ # write our local (filtered) catalogs
+ reposadocommon.writeLocalCatalogs(localcatalogpath)
+
+ # clean up tmpdir
+ # TO-DO: write the cleanup function
+ # cleanup(TMPDIR)
+
+
+def main():
+ '''Main command processing'''
+ p = optparse.OptionParser()
+ p.set_usage('''Usage: %prog [options]''')
+ p.add_option('--recheck', action='store_true',
+ help="""Recheck already downloaded packages for changes.""")
+ options, arguments = p.parse_args()
+ if reposadocommon.validPreferences():
+ if reposadocommon.pref('LocalCatalogURLBase') == None:
+ download_packages = False
+ else:
+ download_packages = True
+ sync(fast_scan=(not options.recheck),
+ download_packages=download_packages)
+
+
+if __name__ == '__main__':
+ main()
+
diff --git a/reposadolib/.DS_Store b/reposadolib/.DS_Store
new file mode 100644
index 0000000..5008ddf
Binary files /dev/null and b/reposadolib/.DS_Store differ
diff --git a/reposadolib/__init__.py b/reposadolib/__init__.py
new file mode 100644
index 0000000..5e7beab
--- /dev/null
+++ b/reposadolib/__init__.py
@@ -0,0 +1,3 @@
+# this is needed to make Python recognize the directory as a module package.
+#
+# Warning: do NOT put any Python imports here that require ObjC.
diff --git a/reposadolib/__init__.pyc b/reposadolib/__init__.pyc
new file mode 100644
index 0000000..db940f0
Binary files /dev/null and b/reposadolib/__init__.pyc differ
diff --git a/reposadolib/reposadocommon.py b/reposadolib/reposadocommon.py
new file mode 100644
index 0000000..0dd6465
--- /dev/null
+++ b/reposadolib/reposadocommon.py
@@ -0,0 +1,363 @@
+#!/usr/bin/env python
+# encoding: utf-8
+#
+# Copyright 2011 Disney Enterprises, Inc. All rights reserved
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+
+# * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+
+# * Redistributions in binary form must reproduce the above copyright
+# notice, this list of conditions and the following disclaimer in
+# the documentation and/or other materials provided with the
+# distribution.
+
+# * The names "Disney", "Walt Disney Pictures", "Walt Disney Animation
+# Studios" or the names of its contributors may NOT be used to
+# endorse or promote products derived from this software without
+# specific prior written permission from Walt Disney Pictures.
+
+# Disclaimer: THIS SOFTWARE IS PROVIDED BY WALT DISNEY PICTURES AND
+# CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING,
+# BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS
+# FOR A PARTICULAR PURPOSE, NONINFRINGEMENT AND TITLE ARE DISCLAIMED.
+# IN NO EVENT SHALL WALT DISNEY PICTURES, THE COPYRIGHT HOLDER OR
+# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND BASED ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+
+"""
+reposadocommon.py
+
+Created by Greg Neagle on 2011-03-03.
+"""
+
+import sys
+import os
+import plistlib
+import urlparse
+import warnings
+from xml.parsers.expat import ExpatError
+
+
+def prefsFilePath():
+ '''Returns path to our preferences file.'''
+ mydir = os.path.dirname(os.path.abspath(__file__))
+ parentdir = os.path.dirname(mydir)
+ return os.path.join(parentdir, 'preferences.plist')
+
+
+def pref(prefname):
+ '''Returns a preference.'''
+ default_prefs = {
+ 'AppleCatalogURLs': ['http://swscan.apple.com/content/catalogs/index.sucatalog',
+'http://swscan.apple.com/content/catalogs/others/index-leopard.merged-1.sucatalog',
+'http://swscan.apple.com/content/catalogs/others/index-leopard-snowleopard.merged-1.sucatalog'],
+ 'PreferredLocalizations': ['English', 'en'],
+ 'CurlPath': '/usr/bin/curl'
+ }
+ try:
+ prefs = plistlib.readPlist(prefsFilePath())
+ except (IOError, ExpatError):
+ prefs = default_prefs
+ if prefname in prefs:
+ return prefs[prefname]
+ elif prefname in default_prefs:
+ return default_prefs[prefname]
+ else:
+ return None
+
+
+def validPreferences():
+ '''Validates our preferences to make sure needed values are defined
+ and paths exist. Returns boolean.'''
+ prefs_valid = True
+ if not pref('UpdatesRootDir'):
+ print_stderr('ERROR: UpdatesRootDir is not defined in %s.',
+ prefsFilePath())
+ prefs_valid = False
+ if not pref('UpdatesMetadataDir'):
+ print_stderr('ERROR: UpdatesMetadataDir is not defined in %s.',
+ prefsFilePath())
+ prefs_valid = False
+ return prefs_valid
+
+
+def str_to_ascii(s):
+ """Given str (unicode, latin-1, or not) return ascii.
+
+ Args:
+ s: str, likely in Unicode-16BE, UTF-8, or Latin-1 charset
+ Returns:
+ str, ascii form, no >7bit chars
+ """
+ try:
+ return unicode(s).encode('ascii', 'ignore')
+ except UnicodeDecodeError:
+ return s.decode('ascii', 'ignore')
+
+
+def concat_message(msg, *args):
+ """Concatenates a string with any additional arguments; drops unicode."""
+ if args:
+ args = [str_to_ascii(arg) for arg in args]
+ try:
+ msg = msg % tuple(args)
+ except TypeError:
+ warnings.warn(
+ 'String format does not match concat args: %s' % (
+ str(sys.exc_info())))
+ return msg
+
+
+def print_stdout(msg, *args):
+ """
+ Prints message and args to stdout.
+ """
+ print concat_message(msg, *args)
+ sys.stdout.flush()
+
+
+def print_stderr(msg, *args):
+ """
+ Prints message and args to stderr.
+ """
+ print >> sys.stderr, concat_message(msg, *args)
+
+
+def writeDataToPlist(data, filename):
+ '''Writes a dict or list to a plist in our metadata dir'''
+ metadata_dir = pref('UpdatesMetadataDir')
+ if not os.path.exists(metadata_dir):
+ try:
+ os.makedirs(metadata_dir)
+ except OSError, errmsg:
+ print_stderr(
+ 'Could not create missing %s because %s',
+ metadata_dir, errmsg)
+ try:
+ plistlib.writePlist(data,
+ os.path.join(metadata_dir, filename))
+ except (IOError, OSError), errmsg:
+ print_stderr(
+ 'Could not write %s because %s', filename, errmsg)
+
+
+def getDataFromPlist(filename):
+ '''Reads data from a plist in our metadata dir'''
+ metadata_dir = pref('UpdatesMetadataDir')
+ try:
+ return plistlib.readPlist(
+ os.path.join(metadata_dir, filename))
+ except (IOError, ExpatError):
+ return {}
+
+
+def getDownloadStatus():
+ '''Reads download status info from disk'''
+ return getDataFromPlist('DownloadStatus.plist')
+
+
+def writeDownloadStatus(download_status_list):
+ '''Writes download status info to disk'''
+ writeDataToPlist(download_status_list, 'DownloadStatus.plist')
+
+
+def getCatalogBranches():
+ '''Reads catalog branches info from disk'''
+ return getDataFromPlist('CatalogBranches.plist')
+
+
+def writeCatalogBranches(catalog_branches):
+ '''Writes catalog branches info to disk'''
+ writeDataToPlist(catalog_branches, 'CatalogBranches.plist')
+
+
+def getProductInfo():
+ '''Reads Software Update product info from disk'''
+ return getDataFromPlist('ProductInfo.plist')
+
+
+def writeProductInfo(product_info_dict):
+ '''Writes Software Update product info to disk'''
+ writeDataToPlist(product_info_dict, 'ProductInfo.plist')
+
+
+def getFilenameFromURL(url):
+ '''Gets the filename from a URL'''
+ (unused_scheme, unused_netloc,
+ path, unused_query, unused_fragment) = urlparse.urlsplit(url)
+ return os.path.basename(path)
+
+
+def getLocalPathNameFromURL(url, root_dir=None):
+ '''Derives the appropriate local path name based on the URL'''
+ if root_dir == None:
+ root_dir = pref('UpdatesRootDir')
+ (unused_scheme, unused_netloc,
+ path, unused_query, unused_fragment) = urlparse.urlsplit(url)
+ relative_path = path.lstrip('/')
+ return os.path.join(root_dir, relative_path)
+
+
+def rewriteOneURL(full_url):
+ '''Rewrites a single URL to point to our local replica'''
+ our_base_url = pref('LocalCatalogURLBase')
+ if not full_url.startswith(our_base_url):
+ # only rewrite the URL if needed
+ (unused_scheme, unused_netloc,
+ path, unused_query, unused_fragment) = urlparse.urlsplit(full_url)
+ return our_base_url + path
+ else:
+ return full_url
+
+
+def rewriteURLsForProduct(product):
+ '''Rewrites the URLs for a product'''
+ if 'ServerMetadataURL' in product:
+ product['ServerMetadataURL'] = rewriteOneURL(
+ product['ServerMetadataURL'])
+ for package in product.get('Packages', []):
+ if 'URL' in package:
+ package['URL'] = rewriteOneURL(package['URL'])
+ if 'MetadataURL' in package:
+ package['MetadataURL'] = rewriteOneURL(
+ package['MetadataURL'])
+ distributions = product['Distributions']
+ for dist_lang in distributions.keys():
+ distributions[dist_lang] = rewriteOneURL(
+ distributions[dist_lang])
+
+
+def rewriteURLs(catalog):
+ '''Rewrites all the URLs in the given catalog to point to our local
+ replica'''
+ if pref('LocalCatalogURLBase') == None:
+ return
+ if 'Products' in catalog:
+ product_keys = list(catalog['Products'].keys())
+ for product_key in product_keys:
+ product = catalog['Products'][product_key]
+ rewriteURLsForProduct(product)
+
+
+def writeAllBranchCatalogs():
+ '''Writes out all branch catalogs. Used when we edit branches.'''
+ for catalog_URL in pref('AppleCatalogURLs'):
+ localcatalogpath = getLocalPathNameFromURL(catalog_URL)
+ writeBranchCatalogs(localcatalogpath)
+
+
+def writeBranchCatalogs(localcatalogpath):
+ '''Writes our branch catalogs'''
+ catalog = plistlib.readPlist(localcatalogpath)
+ downloaded_products = catalog['Products']
+ product_info = getProductInfo()
+
+ localcatalogname = os.path.basename(localcatalogpath)
+ # now strip the '.sucatalog' bit from the name
+ # so we can use it to construct our branch catalog names
+ if localcatalogpath.endswith('.sucatalog'):
+ localcatalogpath = localcatalogpath[0:-10]
+
+ # now write filtered catalogs (branches)
+ catalog_branches = getCatalogBranches()
+ for branch in catalog_branches.keys():
+ catalog['Products'] = {}
+ for product_key in catalog_branches[branch]:
+ if product_key in downloaded_products.keys():
+ # add the product to the Products dict
+ # for this catalog
+ catalog['Products'][product_key] = \
+ downloaded_products[product_key]
+ elif pref('LocalCatalogURLBase') and product_key in product_info:
+ # Product has probably been deprecated by Apple,
+ # so we're using cached product info
+ # First check to see if this product was ever in this
+ # catalog
+ original_catalogs = product_info[product_key].get(
+ 'OriginalAppleCatalogs',[])
+ for original_catalog in original_catalogs:
+ if original_catalog.endswith(localcatalogname):
+ # this item was originally in this catalog, so
+ # we can add it to the branch
+ catalog_entry = \
+ product_info[product_key].get('CatalogEntry')
+ title = product_info[product_key].get('title')
+ version = product_info[product_key].get('version')
+ if catalog_entry:
+ print_stderr(
+ 'WARNING: Product %s (%s-%s) in branch %s '
+ 'has been deprecated. Will use cached info '
+ 'and packages.',
+ product_key, title, version, branch)
+ rewriteURLsForProduct(catalog_entry)
+ catalog['Products'][product_key] = catalog_entry
+ continue
+ else:
+                if pref('LocalCatalogURLBase'):
+ print_stderr(
+ 'WARNING: Product %s not added to branch %s of %s. '
+ 'It is not in the corresponding Apple catalogs '
+ 'and is not in the ProductInfo cache.',
+ product_key, branch, localcatalogname)
+ else:
+ print_stderr(
+ 'WARNING: Product %s not added to branch %s of %s. '
+ 'It is not in the corresponding Apple catalog.',
+ product_key, branch, localcatalogname)
+
+ branchcatalogpath = localcatalogpath + '_' + branch + '.sucatalog'
+ plistlib.writePlist(catalog, branchcatalogpath)
+
+
+def writeLocalCatalogs(applecatalogpath):
+ '''Writes our local catalogs based on the Apple catalog'''
+ catalog = plistlib.readPlist(applecatalogpath)
+ # rewrite the URLs within the catalog to point to the items on our
+ # local server instead of Apple's
+ rewriteURLs(catalog)
+ # remove the '.apple' from the end of the localcatalogpath
+ if applecatalogpath.endswith('.apple'):
+ localcatalogpath = applecatalogpath[0:-6]
+ else:
+ localcatalogpath = applecatalogpath
+
+ downloaded_products_list = getDownloadStatus()
+
+ downloaded_products = {}
+ product_keys = list(catalog['Products'].keys())
+ # filter Products, removing those that haven't been downloaded
+ for product_key in product_keys:
+ if product_key in downloaded_products_list:
+ downloaded_products[product_key] = \
+ catalog['Products'][product_key]
+ else:
+ print_stderr('WARNING: did not add product %s to '
+ 'catalog %s because it has not been downloaded.',
+ product_key, os.path.basename(applecatalogpath))
+ catalog['Products'] = downloaded_products
+
+ # write raw (unstable/development) catalog
+ # with all downloaded Apple updates enabled
+ plistlib.writePlist(catalog, localcatalogpath)
+
+ # now write filtered catalogs (branches) based on this catalog
+ writeBranchCatalogs(localcatalogpath)
+
+
+def main():
+ '''Placeholder'''
+ pass
+
+
+if __name__ == '__main__':
+ main()
+
diff --git a/reposadolib/reposadocommon.pyc b/reposadolib/reposadocommon.pyc
new file mode 100644
index 0000000..c3fe555
Binary files /dev/null and b/reposadolib/reposadocommon.pyc differ
diff --git a/repoutil b/repoutil
new file mode 100755
index 0000000..a5d2a18
--- /dev/null
+++ b/repoutil
@@ -0,0 +1,401 @@
+#!/usr/bin/env python
+# encoding: utf-8
+#
+# Copyright 2011 Disney Enterprises, Inc. All rights reserved
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+
+# * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+
+# * Redistributions in binary form must reproduce the above copyright
+# notice, this list of conditions and the following disclaimer in
+# the documentation and/or other materials provided with the
+# distribution.
+
+# * The names "Disney", "Walt Disney Pictures", "Walt Disney Animation
+# Studios" or the names of its contributors may NOT be used to
+# endorse or promote products derived from this software without
+# specific prior written permission from Walt Disney Pictures.
+
+# Disclaimer: THIS SOFTWARE IS PROVIDED BY WALT DISNEY PICTURES AND
+# CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING,
+# BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS
+# FOR A PARTICULAR PURPOSE, NONINFRINGEMENT AND TITLE ARE DISCLAIMED.
+# IN NO EVENT SHALL WALT DISNEY PICTURES, THE COPYRIGHT HOLDER OR
+# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND BASED ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+
+'''A tool to replicate most of the functionality of
+Apple Software Update server'''
+
+import optparse
+import os
+#import sys
+
+from reposadolib import reposadocommon
+
+
+def deleteBranchCatalogs(branchname):
+ '''Removes catalogs corresponding to a deleted branch'''
+ for catalog_URL in reposadocommon.pref('AppleCatalogURLs'):
+ localcatalogpath = reposadocommon.getLocalPathNameFromURL(catalog_URL)
+ # now strip the '.sucatalog' bit from the name
+ if localcatalogpath.endswith('.sucatalog'):
+ localcatalogpath = localcatalogpath[0:-10]
+ branchcatalogpath = localcatalogpath + '_' + branchname + '.sucatalog'
+ if os.path.exists(branchcatalogpath):
+ os.remove(branchcatalogpath)
+
+
+def get_info(key):
+ '''Prints detail for a specific product'''
+ products = reposadocommon.getProductInfo()
+ if key in products:
+ downloaded_products_list = reposadocommon.getDownloadStatus()
+ if key in downloaded_products_list:
+ status = "Downloaded"
+ else:
+ status = "Not downloaded"
+ catalog_branches = reposadocommon.getCatalogBranches()
+ branchlist = []
+ for branch in catalog_branches.keys():
+ if key in catalog_branches[branch]:
+ branchlist.append(branch)
+
+ reposadocommon.print_stdout('Product: %s', key)
+ reposadocommon.print_stdout('Title: %s', products[key]['title'])
+ reposadocommon.print_stdout('Version: %s', products[key]['version'])
+ reposadocommon.print_stdout('Size: %s',
+ humanReadable(products[key]['size']))
+ reposadocommon.print_stdout(
+ 'Post Date: %s', products[key]['PostDate'])
+ if reposadocommon.pref('LocalCatalogURLBase'):
+ reposadocommon.print_stdout('Status: %s', status)
+ if products[key].get('AppleCatalogs'):
+ reposadocommon.print_stdout('AppleCatalogs:')
+ for catalog in products[key]['AppleCatalogs']:
+ reposadocommon.print_stdout(' %s', catalog)
+ else:
+ print ' Product is deprecated.'
+ if products[key].get('OriginalAppleCatalogs'):
+ reposadocommon.print_stdout('OriginalAppleCatalogs:')
+ for catalog in products[key]['OriginalAppleCatalogs']:
+ reposadocommon.print_stdout(' %s', catalog)
+ reposadocommon.print_stdout('Branches:')
+ if branchlist:
+ for branch in branchlist:
+ reposadocommon.print_stdout(' %s', branch)
+ else:
+            reposadocommon.print_stdout('    <None>')
+ reposadocommon.print_stdout('HTML Description:')
+ reposadocommon.print_stdout(products[key]['description'])
+ else:
+ reposadocommon.print_stdout('No product id %s found.', key)
+
+
+def list_branches():
+ '''Prints catalog branch names'''
+ catalog_branches = reposadocommon.getCatalogBranches()
+ for key in catalog_branches.keys():
+ reposadocommon.print_stdout(key)
+
+
+def humanReadable(size_in_bytes):
+ """Returns sizes in human-readable units."""
+ units = [(" KB", 2**20), (" MB", 2**30), (" GB", 2**40), (" TB", 2**50)]
+ for suffix, limit in units:
+ if size_in_bytes > limit:
+ continue
+ else:
+ return str(round(size_in_bytes/float(limit/2**10), 1)) + suffix
+
+
+def print_product_line(key, products, catalog_branches=None):
+ '''Prints a line of product info'''
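+    # Prints one fixed-width line per product, for example (illustrative):
+    #   041-12345   Security Update 2011-003   1.0   2011-06-23   ['testing']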
+ if key in products:
+ if not catalog_branches:
+ branchlist = ''
+ else:
+ branchlist = []
+ for branch in catalog_branches.keys():
+ if key in catalog_branches[branch]:
+ branchlist.append(branch)
+ deprecation_state = ''
+ if products[key].get('AppleCatalogs', []) == []:
+ deprecation_state = '(Deprecated)'
+ reposadocommon.print_stdout(
+ '%-15s %-50s %-10s %-10s %s %s',
+ key,
+ products[key]['title'],
+ products[key]['version'],
+ products[key]['PostDate'].strftime('%Y-%m-%d'),
+ branchlist,
+ deprecation_state)
+
+
+def list_branch(branchname, sort_order='date', reverse_sort=False):
+ '''List products in a given catalog branch'''
+ catalog_branches = reposadocommon.getCatalogBranches()
+ if branchname in catalog_branches:
+ list_products(sort_order, reverse_sort, catalog_branches[branchname])
+
+
+def list_deprecated(sort_order='date', reverse_sort=False):
+    '''Find products that are no longer referenced in Apple's catalogs'''
+ products = reposadocommon.getProductInfo()
+ list_of_keys = []
+ for product_key in products.keys():
+ if products[product_key].get('AppleCatalogs', []) == []:
+ list_of_keys.append(product_key)
+ list_products(sort_order, reverse_sort, list_of_keys)
+
+
+def list_products(sort_order='date', reverse_sort=False, list_of_keys=None):
+ '''Prints a list of Software Update products'''
+
+ def sort_by_key(a, b):
+ """Internal comparison function for use with sorting"""
+ return cmp(a['sort_key'], b['sort_key'])
+
+ sort_keys = {'date': 'PostDate',
+ 'title': 'title',
+ 'id': 'key'}
+
+ sort_key = sort_keys.get(sort_order, 'date')
+
+ products = reposadocommon.getProductInfo()
+ catalog_branches = reposadocommon.getCatalogBranches()
+ product_list = []
+    if list_of_keys is None:
+ list_of_keys = products.keys()
+ for key in list_of_keys:
+ product_dict = {}
+ product_dict['key'] = key
+ if sort_key == 'key':
+ product_dict['sort_key'] = key
+ else:
+ product_dict['sort_key'] = products[key][sort_key]
+ product_list.append(product_dict)
+ product_list.sort(sort_by_key)
+ if reverse_sort:
+ product_list.reverse()
+ for product in product_list:
+ print_product_line(product['key'], products, catalog_branches)
+
+
+def add_product_to_branch(parameters):
+ '''Adds one or more products to a branch. Takes a list of strings.
+ The last string must be the name of a branch catalog. All other
+ strings must be product_ids.'''
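+    # e.g. add_product_to_branch(['041-12345', '041-67890', 'testing']) would
+    # add both (hypothetical) product ids to the 'testing' branch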
+ # sanity checking
+ for item in parameters:
+ if item.startswith('-'):
+ reposadocommon.print_stderr('Ambiguous parameters: can\'t tell if '
+ '%s is a parameter or option!', item)
+ return
+ branch_name = parameters[-1]
+ product_id_list = parameters[0:-1]
+ catalog_branches = reposadocommon.getCatalogBranches()
+ if not branch_name in catalog_branches:
+ reposadocommon.print_stderr('Catalog branch %s doesn\'t exist!',
+ branch_name)
+ return
+
+ products = reposadocommon.getProductInfo()
+ for product_id in product_id_list:
+ if not product_id in products:
+ reposadocommon.print_stderr(
+ 'Product %s doesn\'t exist!', product_id)
+ else:
+ title = products[product_id]['title']
+ vers = products[product_id]['version']
+ if product_id in catalog_branches[branch_name]:
+ reposadocommon.print_stderr(
+ '%s (%s-%s) is already in branch %s!',
+ product_id, title, vers, branch_name)
+ else:
+ reposadocommon.print_stdout(
+ 'Adding %s (%s-%s) to branch %s...',
+ product_id, title, vers, branch_name)
+ catalog_branches[branch_name].append(product_id)
+
+ reposadocommon.writeCatalogBranches(catalog_branches)
+ reposadocommon.writeAllBranchCatalogs()
+
+
+def remove_product_from_branch(parameters):
+ '''Removes one or more products from a branch. Takes a list of strings.
+ The last string must be the name of a branch catalog. All other
+ strings must be product_ids.'''
+
+ # sanity checking
+ for item in parameters:
+ if item.startswith('-'):
+ reposadocommon.print_stderr(
+ 'Ambiguous parameters: can\'t tell if '
+ '%s is a parameter or option!', item)
+ return
+
+ branch_name = parameters[-1]
+ product_id_list = parameters[0:-1]
+ catalog_branches = reposadocommon.getCatalogBranches()
+ if not branch_name in catalog_branches:
+ reposadocommon.print_stderr(
+ 'Catalog branch %s doesn\'t exist!', branch_name)
+ return
+ products = reposadocommon.getProductInfo()
+    for product_id in product_id_list:
+        if not product_id in products:
+            reposadocommon.print_stderr(
+                'Product %s doesn\'t exist!', product_id)
+            continue
+        title = products[product_id]['title']
+        vers = products[product_id]['version']
+        if not product_id in catalog_branches[branch_name]:
+            reposadocommon.print_stderr('%s (%s-%s) is not in branch %s!',
+                                        product_id, title, vers, branch_name)
+            continue
+
+        reposadocommon.print_stdout('Removing %s (%s-%s) from branch %s...',
+                                    product_id, title, vers, branch_name)
+        catalog_branches[branch_name].remove(product_id)
+ reposadocommon.writeCatalogBranches(catalog_branches)
+ reposadocommon.writeAllBranchCatalogs()
+
+
+def copy_branches(source_branch, dest_branch):
+ '''Copies source_branch to dest_branch, replacing dest_branch'''
+ # sanity checking
+ for branch in [source_branch, dest_branch]:
+ if branch.startswith('-'):
+ reposadocommon.print_stderr(
+ 'Ambiguous parameters: can\'t tell if %s is a branch name or'
+ ' option!', branch)
+ return
+ catalog_branches = reposadocommon.getCatalogBranches()
+ if not source_branch in catalog_branches:
+ reposadocommon.print_stderr('Branch %s does not exist!', source_branch)
+ return
+ if dest_branch in catalog_branches:
+ answer = raw_input(
+ 'Really replace contents of branch %s with branch %s? [y/n] '
+ % (dest_branch, source_branch))
+ if not answer.lower().startswith('y'):
+ return
+    # copy the list (rather than alias it) so the two branches stay independent
+    catalog_branches[dest_branch] = list(catalog_branches[source_branch])
+ reposadocommon.print_stdout('Copied contents of branch %s to branch %s.',
+ source_branch, dest_branch)
+ reposadocommon.writeCatalogBranches(catalog_branches)
+ reposadocommon.writeAllBranchCatalogs()
+
+
+def delete_branch(branchname):
+ '''Deletes a branch'''
+ catalog_branches = reposadocommon.getCatalogBranches()
+ if not branchname in catalog_branches:
+ reposadocommon.print_stderr('Branch %s does not exist!', branchname)
+ return
+ answer = raw_input('Really remove branch %s? [y/n] ' % branchname)
+ if answer.lower().startswith('y'):
+ del catalog_branches[branchname]
+ deleteBranchCatalogs(branchname)
+ reposadocommon.writeCatalogBranches(catalog_branches)
+
+
+def new_branch(branchname):
+ '''Creates a new empty branch'''
+ catalog_branches = reposadocommon.getCatalogBranches()
+ if branchname in catalog_branches:
+ reposadocommon.print_stderr('Branch %s already exists!', branchname)
+ return
+ catalog_branches[branchname] = []
+ reposadocommon.writeCatalogBranches(catalog_branches)
+
+
+def main():
+ '''Main command processing'''
+
+ p = optparse.OptionParser()
+ p.set_usage('''Usage: %prog [options]''')
+ p.add_option('--sync', action='store_true',
+ help="""Synchronize Apple updates""")
+ p.add_option('--products', '--updates', action='store_true',
+ dest='products',
+ help="""List available updates""")
+ p.add_option('--deprecated', action='store_true',
+ help="""List deprecated updates""")
+ p.add_option('--sort', metavar='SORT_ORDER', default='date',
+ help="""Sort list.
+ Available sort orders are: date, title, id""")
+ p.add_option('--reverse', action='store_true',
+ help="""Reverse sort order.""")
+ p.add_option('--branches', '--catalogs',
+ dest='list_branches', action='store_true',
+ help="""List available branch catalogs""")
+ p.add_option('--new-branch',
+ metavar='BRANCH_NAME',
+ help='''Create new empty branch BRANCH_NAME.''')
+ p.add_option('--delete-branch',
+ metavar='BRANCH_NAME',
+ help='''Delete branch BRANCH_NAME.''')
+ p.add_option('--copy-branch', nargs=2,
+ metavar='SOURCE_BRANCH DEST_BRANCH',
+ help='''Copy all items from SOURCE_BRANCH to
+ DEST_BRANCH. If DEST_BRANCH does not exist,
+ it will be created.''')
+ p.add_option('--list-branch', '--list-catalog',
+ dest='branch',
+ metavar='BRANCH_NAME',
+ help="""List updates in branch BRANCH_NAME""")
+ p.add_option('--product-info', '--info', metavar='PRODUCT_ID',
+ dest='info',
+ help="""Print info on a specific update.""")
+ p.add_option('--add-product', '--add-update', '--add',
+ dest='add_product', nargs=2,
+ metavar='PRODUCT_ID [PRODUCT_ID ...] BRANCH_NAME',
+ help='''Add one or more PRODUCT_IDs to catalog branch
+ BRANCH_NAME.''')
+ p.add_option('--remove-product', nargs=2,
+ metavar='PRODUCT_ID [PRODUCT_ID ...] BRANCH_NAME',
+ help='''Remove one or more PRODUCT_IDs from catalog branch
+ BRANCH_NAME.''')
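+    # optparse has no "one or more" nargs, so --add-product and --remove-product
+    # take two values here; any additional PRODUCT_IDs are picked up from the
+    # leftover positional arguments when these options are handled below.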
+
+ options, arguments = p.parse_args()
+
+ if options.sync:
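+        # currently a no-op: the sync call below is commented out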
+ pass
+ #sync(fast_scan=True)
+ if options.products:
+ list_products(sort_order=options.sort, reverse_sort=options.reverse)
+ if options.deprecated:
+ list_deprecated(sort_order=options.sort, reverse_sort=options.reverse)
+ if options.branch:
+ list_branch(options.branch, sort_order=options.sort,
+ reverse_sort=options.reverse)
+ if options.list_branches:
+ list_branches()
+ if options.info:
+ get_info(options.info)
+ if options.new_branch:
+ new_branch(options.new_branch)
+ if options.copy_branch:
+ copy_branches(options.copy_branch[0], options.copy_branch[1])
+ if options.delete_branch:
+ delete_branch(options.delete_branch)
+ if options.add_product:
+ params = list(options.add_product)
+ params.extend(arguments)
+ add_product_to_branch(params)
+ if options.remove_product:
+ params = list(options.remove_product)
+ params.extend(arguments)
+ remove_product_from_branch(params)
+
+if __name__ == '__main__':
+ main()
\ No newline at end of file