package
stringlengths
1
122
package-description
stringlengths
0
1.3M
accsetupparse
ACC Setup Parse. This is a Python 3 library for converting and storing ACC JSON setup files in a Python class. To install the library, use pip install accsetupparse. Currently supported cars: McLaren 720S GT3. Change log: 0.0.1 (17.2.2023.) Initial release. 0.0.2 (17.2.2023.) Fixed markdown files. 0.0.3 (17.2.2023.) Fixed CHANGELOG.md missing from the packaged file; removed unnecessary files. 0.0.4 (17.2.2023.) Fixed unnecessary install_requires. 0.0.5 (23.2.2023.) The class now accepts a Python list with setup values, so there is no longer logic for loading a file into memory; added a carTitle property which holds the car name meant for display in a UI. 0.0.6 (24.2.2023.) Fixed not passing the setup to the class method for converting values. 0.0.7 (24.2.2023.) Added tc2 and steering ratio properties.
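The package's exact class API is not documented above; as a hedged illustration of the general idea (loading an ACC JSON setup file into a Python class), here is a minimal standard-library sketch. The field names and JSON paths below are assumptions for illustration, not the real accsetupparse schema:

```python
import json
from dataclasses import dataclass


@dataclass
class CarSetup:
    """Hypothetical container for a few ACC setup values.

    Field names are illustrative; the real accsetupparse class exposes
    properties such as carTitle, tc2 and a steering ratio.
    """
    car_name: str
    tc2: int
    steer_ratio: int

    @classmethod
    def from_json(cls, text: str) -> "CarSetup":
        raw = json.loads(text)
        # ACC setup files nest values under named sections; these exact
        # paths are assumptions for the sketch.
        return cls(
            car_name=raw["carName"],
            tc2=raw["basicSetup"]["electronics"]["tC2"],
            steer_ratio=raw["advancedSetup"]["mechanicalBalance"]["steerRatio"],
        )


example = ('{"carName": "mclaren_720s_gt3",'
           ' "basicSetup": {"electronics": {"tC2": 3}},'
           ' "advancedSetup": {"mechanicalBalance": {"steerRatio": 11}}}')
setup = CarSetup.from_json(example)
```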
accsr
No description available on PyPI.
accssctrl
accssctrl: Python Attribute Access Control. Free software: BSD license. Documentation: https://accssctrl.readthedocs.io. Features: TODO. Credits: This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template. History: 0.1.0 (2022-08-10) First release on PyPI.
accsyn-python-api
accsyn-python-api: the official accsyn fast film delivery Python API. The complete Python API reference can be found here. Changelog: see doc/release_notes.rst. Documentation: https://accsyn-python-api.readthedocs.io/en/latest. Building: to build the documentation locally, run: cd doc; pip install -r requirements.txt; python -m sphinx -T -E -b html -d _build/doctrees -D language=en . ../dist/doc. Henrik Norin, HDR AB, 2023. accsyn(r) - secure data delivery and workflow sync. https://accsyn.com https://support.accsyn.com
acct
acct: simple and secure account management.

USAGE: YAML files containing confidential information can be encrypted for use inside of acct-based applications. This is an example of what an acct credentials file might look like.

credentials.yml:
provider:
  profile_name:
    username: XXXXXXXXXXXX
    password: XXXXXXXXXXXX
    api_key: XXXXXXXXXXXXXXXXXXX

Next, use the acct command to encrypt this file using the fernet algorithm:

$ acct encrypt credentials.yml
YeckEnWEGOjBDVxxytw13AsdLgquzhCtFHOs7kDsna8=

The acct command can also be used to decrypt the encrypted file:

$ acct decrypt credentials.yml.fernet --output=yaml --acct-key="YeckEnWEGOjBDVxxytw13AsdLgquzhCtFHOs7kDsna8="

The fernet plugin is the default for encryption, but other plugins may be added. To use the AES plugin, change the previous commands to:

$ acct encrypt --crypto-plugin aesgcm256 credentials.yml
YeckEnWEGOjBDVxxytw13AsdLgquzhCtFHOs7kDsna8=
$ acct decrypt --crypto-plugin aesgcm256 credentials.yml.aesgcm256 --output=yaml --acct-key="YeckEnWEGOjBDVxxytw13AsdLgquzhCtFHOs7kDsna8="

You can use the acct command to decrypt the acct file, open it in the default text editor, then overwrite the previous acct file. The default editor from the "EDITOR" environment variable will be used if it is set; otherwise "notepad" on Windows and "vi" on Unix systems. It can also be specified directly with the "--editor" flag.

$ acct edit credentials.yml --acct-key="YeckEnWEGOjBDVxxytw13AsdLgquzhCtFHOs7kDsna8="

What is POP? This project is built with pop, a Python-based implementation of Plugin Oriented Programming (POP). POP seeks to bring together concepts and wisdom from the history of computing in new ways to solve modern computing problems. For more information: Intro to Plugin Oriented Programming (POP), pop-awesome, pop-create.

Getting Started. Prerequisites: Python 3.6+; git (if installing from source, or contributing to the project). Installation: Note: if you want to contribute to the project and set up your local development environment, see the CONTRIBUTING.rst document in the source repository for this project. If you want to use acct, you can install it either from PyPI or from source.

Acknowledgements: Img Shields for making repository badges easy.
acct-backends
ACCT-BACKENDS. INSTALLATION: acct-backends can be installed via pip: pip install acct-backends. INSTALLATION FOR DEVELOPMENT: 1. Clone the acct-backends repository and install with pip: pip install -r requirements.txt. 2. Run pip install -e <path to provider> from your project's root directory. You are now fully set up to begin developing acct plugins. USE: After installation, new acct backends can be specified in your encrypted acct profile:

acct-backend:
  lastpass:
    username: [email protected]
    password: password
    designator: acct-provider-
  keybase:
    username: user
    password: password
acctext
Python wrapper for Accelerated Text.

Installation: $ python -m pip install acctext

Usage:

from acctext import AcceleratedText
at = AcceleratedText(host='http://127.0.0.1:3001')

Make sure the Accelerated Text application is running. Refer to the documentation for launch instructions.

at.health()
{'health': 'Ok'}

Interacting with the dictionary:

items = [{'key': 'house', 'category': 'N', 'forms': ['house', 'houses']},
         {'key': 'hill', 'category': 'N', 'forms': ['hill', 'hills']},
         {'key': 'on', 'forms': ['on'], 'category': 'Prep', 'attributes': {'Operation': 'Syntax.on_Prep/Prep'}},
         {'key': 'the', 'forms': ['the'], 'category': 'Det', 'attributes': {'Operation': 'Syntax.the_Det/Det'}}]
for item in items:
    at.create_dictionary_item(**item)
items = at.list_dictionary_items()
items
[{'id': 'the_Det_Eng', 'key': 'the', 'forms': ['the'], 'category': 'Det', 'language': 'Eng', 'attributes': {'Operation': 'Syntax.the_Det/Det'}}, {'id': 'hill_N_Eng', 'key': 'hill', 'forms': ['hill', 'hills'], 'category': 'N', 'language': 'Eng', 'attributes': {}}, {'id': 'house_N_Eng', 'key': 'house', 'forms': ['house', 'houses'], 'category': 'N', 'language': 'Eng', 'attributes': {}}, {'id': 'on_Prep_Eng', 'key': 'on', 'forms': ['on'], 'category': 'Prep', 'language': 'Eng', 'attributes': {'Operation': 'Syntax.on_Prep/Prep'}}]

Working with data. Upload a local file:

at.upload_data_file('example_data.csv')
{'message': 'Succesfully uploaded file', 'id': 'example_data.csv'}

Create a data file from scratch:

at.create_data_file('example_data_2.csv', ['a', 'b'], [['1', '2'], ['3', '4']])
{'id': 'example_data_2.csv'}

List available data files:

[x['id'] for x in at.list_data_files()]
['example_data.csv', 'example_data_2.csv']

Fetch a data file:

at.get_data_file('example_data_2.csv')
{'id': 'example_data_2.csv', 'filename': 'example_data_2.csv', 'header': ['a', 'b'], 'rows': [['1', '2'], ['3', '4']]}

Delete a data file:

at.delete_data_file('example_data_2.csv')
{'message': 'Succesfully deleted file', 'id': 'example_data_2.csv'}

Languages and readers. Fetch existing language properties:

at.get_language('Eng')
{'id': 'Eng', 'name': 'English', 'flag': '🇬🇧', 'default': True}

Add a new language:

at.add_language('Ger', 'German')
{'id': 'Ger', 'name': 'German', 'flag': '🇩🇪', 'default': False}

List available languages:

at.list_languages()
[{'id': 'Eng', 'name': 'English', 'flag': '🇬🇧', 'default': True}, {'id': 'Ger', 'name': 'German', 'flag': '🇩🇪', 'default': False}]

Create new reader types:

at.create_reader('Dc', 'Discount Customer', '(DC)')
{'id': 'Dc', 'name': 'Discount Customer', 'flag': '(DC)', 'default': False}
at.create_reader('Lc', 'Loyal Customer', '(LC)')
{'id': 'Lc', 'name': 'Loyal Customer', 'flag': '(LC)', 'default': False}

List available readers:

at.list_readers()
[{'id': 'Dc', 'name': 'Discount Customer', 'flag': '(DC)', 'default': False}, {'id': 'Lc', 'name': 'Loyal Customer', 'flag': '(LC)', 'default': False}]

Document plans: Open the Accelerated Text document plan editor (http://127.0.0.1:8080 by default) and create a new document plan named "House description". More detailed instructions can be found in the documentation. Fetch a single document plan:

dp = at.get_document_plan(name='House description')
dp['documentPlan']
{'type': 'Document-plan', 'segments': [{'children': [{'modifier': {'name': 'size', 'type': 'Cell-modifier', 'srcId': 'B-D0i/`TL4@ja%{U!?2G', 'child': {'name': 'color', 'type': 'Cell-modifier', 'srcId': '!2b?}PBIB?i]%*/(~?XM', 'child': {'name': 'house', 'type': 'Dictionary-item', 'srcId': '+5JLY;_/2/zEOcZ._$,4', 'kind': 'N', 'itemId': 'house_N_Eng'}}}, 'type': 'Modifier', 'srcId': '`62!swypAqp_jK_lr1Ow', 'child': {'name': 'on', 'type': 'Dictionary-item-modifier', 'srcId': ']MNfAFBjxy,c?G55a04@', 'kind': 'Prep', 'child': {'name': 'the', 'type': 'Dictionary-item-modifier', 'srcId': '62%#$13DP}Gj8=n4NCI.', 'kind': 'Det', 'child': {'name': 'hill', 'type': 'Dictionary-item', 'srcId': 'Ol68tPXKblg(pUghVhb@', 'kind': 'N', 'itemId': 'hill_N_Eng'}, 'itemId': 'the_Det_Eng'}, 'itemId': 'on_Prep_Eng'}}], 'type': 'Segment', 'srcId': ']H[rfMhNu,^(wX6[%.+w'}], 'srcId': 'Li$gv+b_9o-n$z^FnSl~'}

Delete a document plan:

at.delete_document_plan(dp['id'])
True

Restore a document plan:

at.create_document_plan(**dp)['name']
'House description'

List document plans:

[x['name'] for x in at.list_document_plans(kind='Document')]
['House description']

Text generation:

result = at.generate('House description', data={"size": "small", "color": "red"})
result['variants']
['Small red house on the hill.']

Bulk generation:

results = at.generate_bulk('House description', data=[{"size": "small", "color": "red"}, {"size": "big", "color": "green"}])
[x['variants'] for x in results]
[['Small red house on the hill.'], ['Big green house on the hill.']]

Fetch a specific result:

at.get_result(result['resultId'])
{'resultId': 'a364335f-5021-443d-9c77-fe40c296ecef', 'offset': 0, 'totalCount': 1, 'ready': True, 'updatedAt': 1628173135, 'variants': ['Small red house on the hill.']}

Working with state. Export: at.export_state('state.zip'). Clear: at.clear_state(). Restore: at.restore_state('state.zip')
acctf
acctf. English | 日本語. This is a library that obtains deposit/withdrawal history and the price and quantity of held stocks from bank and securities accounts. Currently, it supports the following. Securities: SBI Securities: yen-denominated: stocks (cash/specified deposit), funds (specified deposit), NISA deposit (accumulated investment limit), old accumulated NISA deposit; foreign-denominated: stocks (US only) (cash/specified deposit). Bank: Mizuho Bank: balance (yen only), transaction history. SBI Net Bank: balance (including hybrid deposit, yen only), transaction history (including hybrid deposit, yen only). Other: WealthNavi: each valuation.

How to use. Installation: pip install acctf

Example. Securities:

from acctf.securities.sbi import SBI

sbi = SBI().login("<user ID>", "<password>")
stocks = sbi.get_stock_specific()
print("Name, Quantity, Acquisition price, Current price")
for s in stocks:
    print(f"{s.name}, {s.amount}, {s.acquisition_value}, {s.current_value}")
sbi.close()

Output (translated from Japanese):
Name, Quantity, Acquisition price, Current price
0000 Stock 1, 1000, 1234, 2345
1111 Stock 2, 1500, 789, 987
2222 Stock 3, 2000, 3450, 3456

Bank balance:

from acctf.bank.mizuho import Mizuho

mizuho = Mizuho().login("<user ID>", "<password>")
b = mizuho.get_balance("7654321")
print("Account number, Branch, Balance, Account type")
print(f"{b[0].account_number}, {b[0].branch_name}, {b[0].value}, {b[0].deposit_type}")
mizuho.close()

Account number, Branch, Balance, Account type
7654321, Head office, 1234567.0, DepositType.ordinary

Transaction history:

from acctf.bank.mizuho import Mizuho

mizuho = Mizuho().login("<user ID>", "<password>")
hist = mizuho.get_transaction_history("7654321")
# You can also specify the start/end date:
# hist = mizuho.get_transaction_history("7654321", date(2023, 12, 1), date(2023, 12, 31))
print("Date, Description, Amount")
for h in hist:
    print(f"{h.date}, {h.content}, {h.value}")
mizuho.close()

Date, Description, Amount
2023-12-01, ATM withdrawal, -10000.0
2024-12-20, Salary, 200000.0

Other (WealthNavi):

from acctf.other.wealthnavi import WealthNavi

w = WealthNavi().login("<user ID>", "<password>", "<one-time password>")
# If you don't set the one-time password:
# w = WealthNavi().login("<user ID>", "<password>")
print("Asset class, Current value, P/L")
for h in w.get_valuation():
    print(f"{h.name}, {h.value}, {h.pl_value}")
w.close()

Asset class, Current value, P/L
US stocks (VTI), 123456.0, 12345.0
Japan/Europe stocks (VEA), 123456.0, 12345.0
Emerging-market stocks (VWO), 123456.0, 12345.0
Bonds (AGG), 123456.0, 12345.0
Gold (GLD), 123456.0, 12345.0
Gold (IAU), 123456.0, 12345.0
Real estate (IYR), 123456.0, 12345.0
Cash, 123456.0, 0.0
acctools
This is a simple package providing an elegant preprocessing procedure.
accu
Django Accu. More batteries for Django: an opinionated package for creating a Django application faster.

Architecture: This project is not flexible in the supported architecture; it makes clear assumptions about the structure used. It uses: Python >= 3.9, Django >= 4.1, DRF >= 3.14, Django Q >= 1.3, Sentry (optional).

Provided namespaces. Currently available:
https://github.com/matmair/django-accu/issues/1 core.checks: checks to make sure the Django project conforms to the core assumptions made
https://github.com/matmair/django-accu/issues/2 core.config: utils to load settings from different sources (environment, .env, config files, ...)
https://github.com/matmair/django-accu/issues/3 core.clean: clean user inputs with bleach
https://github.com/matmair/django-accu/issues/4 core.urls: import for this project's URLs
https://github.com/matmair/django-accu/issues/5 core.utils: helper functions
https://github.com/matmair/django-accu/issues/6 tasks: async execution of functions

On the roadmap:
https://github.com/matmair/django-accu/issues/7 plugins: a plugin implementation for Django
https://github.com/matmair/django-accu/issues/8 settings: database settings
https://github.com/matmair/django-accu/issues/8 settings.api: expose the settings for consumption in apps/frontends

TODO / to be ported from codebases: users: expandable user (uses a profile to expand the model to keep usability with other projects high); changelogs; approvals; setups; core.update; reports.client (jasper-server-based); reports.file (jasper-file-based).

To be finished and ported: users.token: multi-token implementation for usage with DRF; core.security.api: expose all core security flows via API (registration, login, logoff, deletion, MFA-actions, password-reset).

WARNING: THIS IS CURRENTLY A PLACEHOLDER WHILE SOME HURDLES ARE CLEARED.

Acknowledgement: This repo contains code from InvenTree/InvenTree (MIT licensed), MissionLog (closed source, with permission) and several closed-source projects I (matmair / mjmair.com) work on with the permission of the owners.
accuasset
accuasset python package. accuasset is the Accuinsight Asset Python package that makes data science simpler.
accubib
No description available on PyPI.
accubits
Accubits CLI. A command line tool to create web apps using boilerplate code. Installation: install Python 3 on your system and run the command below:

$ pip install accubitscli

Create an Angular app with boilerplate:

$ accubits create ng yourangularappname
accudata
Do you collect heterogeneous data step by step? Here you find a convenient solution to this problem. The class AccumulativeData provides a simple interface to store data step by step. The data can consist of: numbers, lists/arrays, objects. You can store it as a pickled object or a Pandas DataFrame.

Installation: the module can be installed from pip: pip install accudata

For example: you have a social-data collecting process, and on every step you must collect heterogeneous data: the name of a person, age, interests, and preferences by categories (food, pets, sport, politics). You can make a class:

from accudata import AccumulativeData

class PeopleAccData(AccumulativeData):
    def __init__(self):
        lists = ['name', 'age', 'interests']
        dicts = {'pref': ['food', 'pets', 'sport', 'politics']}
        super().__init__(lists=lists, dicts=dicts)

After that you can make an iterative collecting process as follows:

Data = PeopleAccData()
for item in raw_data:
    Data.next()
    # \\\ A complicated code to extract data
    name, age, interests, food, pets, sport, politics, _ = extract_data(item)
    Data.append(name, age, interests, pref=[food, pets, sport, politics])

It is simple to get the data:

names = Data.name
# Make the dataframe
dataframe = Data.todf()
print(dataframe.name)
accumulate
Package accumulate eases inheritance of iterable class attributes by accumulating values along the MRO.
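The package's own API is not shown above; as a conceptual sketch (not the package's actual implementation) of what "accumulating values along the MRO" means, a subclass's list attribute can be combined with its bases' instead of shadowing them:

```python
def accumulated(obj, name):
    """Collect the values of a list-valued class attribute from every
    class in the instance's MRO, base classes first."""
    values = []
    for klass in reversed(type(obj).__mro__):
        values.extend(vars(klass).get(name, ()))
    return values


class Base:
    tags = ["base"]


class Child(Base):
    tags = ["child"]  # would normally shadow Base.tags entirely


print(accumulated(Child(), "tags"))  # ['base', 'child']
```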
accumulation-tree
A red/black tree which also stores partial aggregations at each node, making getting aggregations of key range slices an O(log(N)) operation.
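The package's API is not shown above. As a conceptual illustration of the same idea (O(log N) aggregation over ranges), here is a Fenwick tree sketch; note this is a deliberately simpler structure than the package's augmented red/black tree, which additionally supports arbitrary ordered keys:

```python
class FenwickTree:
    """Range-sum aggregation in O(log N) per update/query over integer
    indices. Illustrates the log-time aggregation idea only; the
    accumulation-tree package stores partial aggregations in a
    red/black tree instead."""

    def __init__(self, size):
        self.tree = [0] * (size + 1)

    def add(self, i, delta):
        i += 1
        while i < len(self.tree):
            self.tree[i] += delta
            i += i & -i

    def prefix_sum(self, i):
        """Sum of elements in [0, i)."""
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & -i
        return total

    def range_sum(self, lo, hi):
        """Sum of elements in [lo, hi)."""
        return self.prefix_sum(hi) - self.prefix_sum(lo)


t = FenwickTree(8)
for i in range(8):
    t.add(i, i + 1)  # values 1..8
print(t.range_sum(2, 5))  # 3 + 4 + 5 = 12
```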
accumulator
Copyright (c) 2015 Rafael da Silva Rochahttps://github.com/rochars/accumulatorInstallation$ pip install accumulatorCompatibilityAccumulator is compatible with Python 2.7, 3.3, 3.4 and 3.5.VersionThe current version is 0.3 alpha.LicenseCopyright (c) 2015 Rafael da Silva RochaPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
accupy
Accurate sums and (dot) products for Python.

Sums: Summing up values in a list can get tricky if the values are floating-point numbers; digit cancellation can occur and the result may come out wrong. A classical example is the sum

1.0e16 + 1.0 - 1.0e16

The actual result is 1.0, but in double precision this comes out as 0.0. While in this example the failure is quite obvious, it can get a lot more tricky than that. accupy provides

p, exact, cond = accupy.generate_ill_conditioned_sum(100, 1.0e20)

which, given a length and a target condition number, will produce an array of floating-point numbers that is hard to sum up. Given one or two vectors, accupy can compute the condition of the sum or dot product via

accupy.cond(x)
accupy.cond(x, y)

accupy has the following methods for summation:

accupy.kahan_sum(p): Kahan summation
accupy.fsum(p): a vectorization wrapper around math.fsum (which uses Shewchuk's algorithm [1]; see also here)
accupy.ksum(p, K=2): summation in K-fold precision (from [2])

All summation methods sum the first dimension of a multidimensional NumPy array. Let's compare them.

Accuracy comparison (sum): As expected, the naive sum performs very badly with ill-conditioned sums; likewise for numpy.sum, which uses pairwise summation. Kahan summation is not significantly better; this, too, is expected. Computing the sum with 2-fold accuracy in accupy.ksum gives the correct result if the condition is at most in the range of machine precision; further increasing K helps with worse conditions. Shewchuk's algorithm in math.fsum always gives the correct result to full floating-point precision.

Runtime comparison (sum): We compare more and more sums of fixed size (above) and larger and larger sums, but a fixed number of them (below). In both cases, the least accurate method is the fastest (numpy.sum), and the most accurate the slowest (accupy.fsum).

Dot products: accupy has the following methods for dot products:

accupy.fdot(p): a transformation of the dot product of length n into a sum of length 2n, computed with math.fsum
accupy.kdot(p, K=2): dot product in K-fold precision (from [2])

Let's compare them.

Accuracy comparison (dot): accupy can construct ill-conditioned dot products with

x, y, exact, cond = accupy.generate_ill_conditioned_dot_product(100, 1.0e20)

With this, the accuracy of the different methods is compared. As for sums, numpy.dot is the least accurate, followed by instances of kdot. fdot is provably accurate up into the last digit.

Runtime comparison (dot): NumPy's numpy.dot is much faster than all alternatives provided by accupy. This is because the bookkeeping of truncation errors takes more steps, but mostly because of NumPy's highly optimized dot implementation.

References:
[1] Jonathan Richard Shewchuk, Adaptive Precision Floating-Point Arithmetic and Fast Robust Geometric Predicates, J. Discrete Comput. Geom. (1997), 18(305), 305–363
[2] Takeshi Ogita, Siegfried M. Rump, and Shin'ichi Oishi, Accurate Sum and Dot Product, SIAM J. Sci. Comput. (2006), 26(6), 1955–1988 (34 pages)

Dependencies: accupy needs the C++ Eigen library, provided in Debian/Ubuntu by libeigen3-dev.

Installation: accupy is available from the Python Package Index, so you can install it with pip install accupy.

Testing: To run the tests, just check out this repository and type MPLBACKEND=Agg pytest
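The classical cancellation example above can be reproduced with the standard library alone (no accupy needed); the Kahan summation here is a textbook sketch, not accupy's implementation:

```python
import math

p = [1.0e16, 1.0, -1.0e16]


def kahan_sum(values):
    """Textbook compensated (Kahan) summation."""
    s = c = 0.0
    for x in values:
        y = x - c       # subtract the running compensation
        t = s + y
        c = (t - s) - y  # recover the low-order bits lost in s + y
        s = t
    return s


print(sum(p))        # 0.0 -- naive summation loses the 1.0
print(kahan_sum(p))  # 0.0 -- Kahan alone does not rescue this case
print(math.fsum(p))  # 1.0 -- Shewchuk's algorithm recovers the exact result
```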
accuracy
accuraCy. It's pronounced "accura-see". For spaCy models. The goal of this project is to generate reports for spaCy models.

What it does: The goal of accuraCy is to offer static reports for spaCy models that help users make better decisions on how the models can be used. At the moment the project supports reports for threshold values for classification. Here's a preview of what to expect. There are two kinds of charts. The first kind is a density chart. This chart shows the distribution of confidence scores for a given class. The blue area represents documents that had the tag assigned to the class; the orange area represents documents that didn't. The second kind is a line chart that demonstrates the accuracy, precision and recall values for a given confidence threshold. It's an interactive chart, and you can explore the values by hovering over it.

Install: You can install the latest version from git.

python -m pip install "accuracy @ git+https://github.com/koaning/accuracy.git"

Usage: The accuracy project provides a command line interface that can generate reports. The full CLI can also be explored via the --help flag.

> python -m accuracy --help
Usage: python -m accuracy [OPTIONS] COMMAND [ARGS]...

  It's pronounced 'accura-see'. For spaCy models.

Options:
  --help  Show this message and exit.

Commands:
  report   Generate a model report.
  version  Show version number.

accuracy report: The most important command is the report command. You'd typically use it via a command similar to:

> python -m accuracy report training/model-best/ corpus/train.spacy corpus/dev.spacy
Loading model at training/model-best
Running model on training data...
Running model on development data...
Generating Charts ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 0:00:00
Done! You can view the report via:
python -m http.server --directory reports PORT

This will generate a folder, typically called "reports", that contains a full dashboard for the trained spaCy model found in training/model-best. The CLI has a few configurable settings:

Arguments:
  [MODEL_PATH]  Path to spaCy model
  [TRAIN_PATH]  Path to training data
  [DEV_PATH]    Path to development data
  [FOLDER_OUT]  Output folder for reports  [default: reports]

Options:
  --classes TEXT  Comma-separated string of classes to use
  --help          Show this message and exit.
accurate-timed-loop
Accurate Timed Loop. This is a Python module that provides a way to have an accurate timed loop. For example, if you need to do an activity every 250 ms +/- 10 ms, this loop will do that.

Sample code (see sample.py for a full example):

import accurate_timed_loop

loop_delay = 0.250  # seconds
total_wait = 25.0   # seconds
for elapsed, start_time in accurate_timed_loop.accurate_wait(total_wait, loop_delay):
    # ... do task every 250 ms
    pass

Accuracy and Limitations: sample.py does testing and shows that on Windows MSYS2 the standard deviation of the error is roughly 4 ms in a 250 ms loop. This means that 95% of loops will be within +/- 8 ms of the requested loop_delay.

expected elapsed diff1(ms) actual(s) diff2(ms)
1 0.000000 0.000000 0.000 0.000000 0.000
2 0.250000 0.257294 7.294 0.257294 7.294
<snip>
100 24.750000 24.764093 14.093 24.764093 14.093
101 25.000000 25.015579 15.579 25.015579 15.579
Stats:
loop count : 101 loops
Error Range : 0.000 to 24.406 mS
Error Stddev : 5.009 mS
Error Average : 8.863 mS
Recommended adj: 0.012200
sample rc=0
doit overall rc=0

This value is specific to Windows and to the PC that it is running on. To make it more accurate for your PC and OS, use the fixed_adjustment parameter. Set it so the minimum and maximum are roughly symmetrical around 0; the stddev and average error values at that point should be minimal.

import accurate_timed_loop

loop_delay = 0.250  # seconds
total_wait = 25.0   # seconds
adj = 0.009228      # macOS
for elapsed, start_time in accurate_timed_loop.accurate_wait(total_wait, loop_delay, fixed_adjustment=adj):
    # ... do task every 250 ms
    pass

Notes: Re-run this several times and tweak the fixed adjustment. sample.py reports a "Recommended adj" that usually results in better accuracy. macOS and Ubuntu tend to be less variable than Windows. This report shows that the standard deviation is much better:

expected elapsed diff1(ms) actual(s) diff2(ms)
1 0.000000 0.000000 0.000 0.000000 0.000
2 0.250000 0.251537 1.537 0.251537 1.537
<snip>
101 25.000000 24.989502 -10.498 24.989502 -10.498
102 25.250000 25.241386 -8.614 25.241386 -8.614
Stats:
loop count : 102 loops
Error Range : -9.228 to 5.864 mS
Error Stddev : 1.238 mS
Error Average : 4.953 mS
Recommended adj: 0.009228
sample rc=0
doit overall rc=0

Limitations: There is no guarantee that the average error will always be that low or that consistent. The following runs were on macOS:

# === first run:
Stats: loop count : 102 loops; Error Range : -9.486 to 4.613 mS; Error Stddev : 1.962 mS; Error Average : 5.775 mS; Recommended adj: 0.009486
# === second run:
Stats: loop count : 102 loops; Error Range : -9.587 to 3.287 mS; Error Stddev : 2.163 mS; Error Average : 6.745 mS; Recommended adj: 0.009587
# === third run:
Stats: loop count : 102 loops; Error Range : -9.472 to 6.782 mS; Error Stddev : 1.546 mS; Error Average : 5.597 mS; Recommended adj: 0.009472
# === fourth run:
Stats: loop count : 101 loops; Error Range : -9.518 to 10.365 mS; Error Stddev : 1.865 mS; Error Average : 5.410 mS; Recommended adj: 0.009518
# === fifth run:
Stats: loop count : 101 loops; Error Range : -9.369 to 13.726 mS; Error Stddev : 2.196 mS; Error Average : 5.614 mS; Recommended adj: 0.009369

If you use the adj parameter, the incoming "elapsed" parameter will not arrive exactly at your expected delay. For example, these two came in at 24.749 seconds instead of the expected 24.750 seconds, and at 24.999 seconds instead of the expected 25.000 seconds:

expected elapsed diff1(ms) actual(s) diff2(ms)
100 24.750000 24.749573 -0.427 24.749573 -0.427
101 25.000000 24.999601 -0.399 24.999601 -0.399
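As a conceptual sketch of how such a loop avoids drift (this is not the package's implementation; the generator name and fixed_adjustment parameter merely mirror its documented interface), the key idea is to sleep until an absolute deadline rather than sleeping for a fixed interval at the end of each iteration:

```python
import time


def accurate_wait(total_wait, loop_delay, fixed_adjustment=0.0):
    """Yield (elapsed, start_time) roughly every loop_delay seconds.

    A conceptual sketch only: sleeping until an absolute deadline keeps
    errors from accumulating the way they do when each iteration simply
    calls sleep(loop_delay).
    """
    start = time.monotonic()
    deadline = start
    while True:
        elapsed = time.monotonic() - start
        if elapsed > total_wait:
            break
        yield elapsed, start
        deadline += loop_delay
        # fixed_adjustment compensates for a platform's systematic sleep
        # overshoot, analogous to the package's parameter.
        remaining = deadline - time.monotonic() - fixed_adjustment
        if remaining > 0:
            time.sleep(remaining)


for elapsed, start_time in accurate_wait(0.1, 0.02):
    pass  # ... do task every 20 ms
```

Because each deadline is computed from the loop start rather than from the previous wake-up, a late wake-up in one iteration shortens the next sleep instead of shifting every subsequent tick.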
accurating
AccuRating. Library computing accurate ratings based on match results. Documentation: Package; Usage example.

Is AccuRating accurate? Yes. With the same data, AccuRating will return much more accurate ratings than chess ELO or EGD. For instance, if a player Leon manages to win against player Caesar, Leon will "get" some X points for that victory. If later Caesar wins against strong players Alusia and EmCee, his rating will increase, but that would not affect Leon's rating in chess or EGD. If we want to use the data efficiently, Leon's rating should be adjusted, because Caesar clearly demonstrated his power and it was not long ago that poor Leon lucked out a victory against Caesar. EGD and chess ELO go over the data once, so they have no way of taking that into account. AccuRating shows that we need about 1500-2000 passes over the data to converge to accurate ratings. AccuRating is a variant of Whole-History-Rating, which is also used at https://www.goratings.org/. Interesting discussion on why WHR is good.

What do AccuRating rating numbers mean exactly? Only differences matter, i.e. adding say 100 to all the ratings would yield equally valid points.

100 AccuRating points of difference is 1:2 win odds (33% vs 66%)
200 AccuRating points of difference is 1:5 win odds (20% vs 80%)
300 AccuRating points of difference is 1:9 win odds (11% vs 89%)

The exact formula is $P(win) = 1 / (1 + 2^{d / 100})$. The optimization algorithm finds the ratings that maximize the probability of the data.

Compared to chess ELO: Chess ELO is similar, but the points are rescaled by 1.20412:

120.412 chess ELO difference for 1:2 win odds (33% vs 66%)
240.824 chess ELO difference for 1:5 win odds (20% vs 80%)
361.236 chess ELO difference for 1:9 win odds (11% vs 89%)

The chess ELO formula is $P(win) = 1 / (1 + 10^{d / 400})$.

Compared to EGD: In EGD, the winning odds for 100 points of rating are not fixed. This is because 1 dan/kyu difference = 100 EGD = 1 handicap, and the nature of Go is that 1 handicap (i.e. 100 EGD) means more on a dan level than on a kyu level. On the dan level:

90 EGD points of difference is approximately 1:2 win odds (33% vs 66%)
180 EGD points of difference is approximately 1:5 win odds (20% vs 80%)
270 EGD points of difference is approximately 1:9 win odds (11% vs 89%)

On the kyu level:

300 EGD points of difference is approximately 1:2 win odds (33% vs 66%)
600 EGD points of difference is approximately 1:5 win odds (20% vs 80%)
900 EGD points of difference is approximately 1:9 win odds (11% vs 89%)

Based on these tables.

If AccuRating is so accurate, why are other systems not using it? Typical ELO systems (like EGD) use every game result only once and update the rating based on it. This model can do 1500 passes over the data until it has converged (the first pass returns numbers more or less like a standard ELO system; the equations are almost the same). However, so many passes are too expensive when the number of games is as big as EGD's.

What is AccuRating bad for? This system does not have any gamification incentives, so it is bad for player motivation. It uses data efficiently and nicely approximates true strength. It can be compared to the hidden MMR used for match-making in games like Starcraft 2, not to the player-visible motivating points with various "bonus points".

Model details: The model finds a single number (ELO strength) for each player. Given the ELO of two players, the model predicts the probability of a win in a game. If we assume that $P$ and $Q$ are the rankings of two players, the model assumes:

$$P(\text{P vs Q win}) = \frac{2^P}{2^P + 2^Q} = \frac{1}{1 + 2^{Q-P}}$$

This means that if both rankings are equal to $a$, then $P(win) = \frac{2^a}{2^a+2^a} = 0.5$. If the ranking difference is one point, we have $P(win) = \frac{2^{a+1}}{2^{a+1}+2^{a}} = \frac{2}{2+1}$. A two-point advantage yields $P(win) = \frac{4}{5}$, and an $n$-point advantage yields $P(win) = \frac{1}{1+2^{-n}}$ (equivalently, the weaker player wins with probability $\frac{1}{1+2^n}$). For readability reasons we rescale the points by 100. This is exactly equivalent to using this equation:

$$\frac{1}{1 + 2^{\frac{Q-P}{100}}}$$

Comparison to single-pass systems: In single-pass systems, if you play a game, it will not affect the model's estimate of your rating yesterday. In a multi-pass system we can estimate every player's rating for every season (or even every day), then iteratively pass over the data again and again until we find the rating curves that best fit the data. There is a parameter that controls how fast the rating can change. The WHR model assumes that player rating is a Gaussian process in time, and this parameter is the variance of that Gaussian process. The consequence is that data flows in both time directions: if you play a game in season 20, it will also affect your ratings for season 19 (a bit) and season 18 (a very small bit), etc. The data also flows over the whole graph of games and players on each iteration.

Can I convert these ratings to EGD? Because EGD uses a different exponent base, it is not that easy to convert directly. There is a monotonic conversion function, but it is non-linear, and it would take some work to derive the formula. It would be interesting to plot EGD ratings against AccuRating ratings.

What's implemented: This repo implements: the Bradley-Terry (BT) model for player ranking (a better variant of ELO); a generalization of the BT model to a variant of the WHR model with time as seasons.

ToDo and ideas: distribution fit to account for heavy tails; chess units; EGD conversion; fit player variance (a high-variance player can more easily win against stronger players and more easily lose against weaker players); follow this: https://www.marwandebbiche.com/posts/python-package-tooling/

Development:

git config --global core.hooksPath .githooks/
git clone https://github.com/lukaszlew/accurating
poetry install
poetry run pytest
poetry run mypy .
# edit stuff; increase version
poetry build
poetry publish
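The odds tables above follow directly from the rating formula and can be checked numerically (a self-contained check, not part of the AccuRating package):

```python
def win_probability(diff):
    """Probability that the higher-rated player wins, given an
    AccuRating point difference diff = P - Q.

    Derived from P(P vs Q win) = 1 / (1 + 2^((Q - P) / 100))."""
    return 1.0 / (1.0 + 2.0 ** (-diff / 100.0))


print(round(win_probability(100), 3))  # 0.667 -> 1:2 odds (33% vs 66%)
print(round(win_probability(200), 3))  # 0.8   -> 1:5 odds (20% vs 80%)
print(round(win_probability(300), 3))  # 0.889 -> 1:9 odds (11% vs 89%)
```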
accure-line-seg
No description available on PyPI.
accure-ocr-lineseg
No description available on PyPI.
accure-ocr-seg
No description available on PyPI.
accuri2fcs
Accuri2fcs is a command line program for the conversion of Accuri .c6 formatted flow cytometry files to standard .fcs. In practice this is relatively straightforward, as Accuri files are simply zip structures containing multiple .fcs files in a structured format. However, extracting and naming these individual files is a little more tricky. This program allows rapid batch processing of multiple Accuri files into multiple .fcs files, with regular-expression sample-name matching, splitting and copying to build the output filenames. It can get quite complicated; see the examples.
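Since .c6 containers are described above as plain zip structures, the extraction core can be sketched with the standard-library zipfile module. This is a minimal sketch of the idea only, not accuri2fcs's actual code (the real program additionally renames outputs via regular-expression sample-name matching):

```python
import zipfile


def extract_fcs(c6_path, out_dir="."):
    """Extract the .fcs members of an Accuri .c6 container into out_dir.

    Returns the list of extracted member names. A sketch of the core
    extraction step only; naming/splitting logic is omitted.
    """
    extracted = []
    with zipfile.ZipFile(c6_path) as archive:
        for name in archive.namelist():
            if name.lower().endswith(".fcs"):
                archive.extract(name, out_dir)
                extracted.append(name)
    return extracted
```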
accustom
CloudFormation Accustom

CloudFormation Accustom is a library for responding to Custom Resources in AWS CloudFormation using the decorator pattern. This library provides a cfnresponse method, some helper static classes, and some decorator methods to help with the function.

Installation

CloudFormation Accustom can be found under PyPI at https://pypi.python.org/pypi/accustom.

To install:

python3 -m pip install accustom

To create a Lambda Code Bundle in Zip Format with CloudFormation Accustom and dependencies (including requests), create a directory with only your code in it and run the following. Alternatively you can create a Lambda Layer with CloudFormation Accustom and dependencies installed and use that as your base layer for custom resources.

python3 -m pip install accustom -t .
zip code.zip * -r

Quickstart

The quickest way to use this library is to use the standalone decorator @accustom.sdecorator, in a Lambda function:

import accustom

@accustom.sdecorator(expectedProperties=['key1', 'key2'])
def resource_handler(event, context):
    result = (float(event['ResourceProperties']['key1']) +
              float(event['ResourceProperties']['key2']))
    return {'sum': result}

In this configuration, the decorator will check to make sure the properties key1 and key2 have been passed by the user, and automatically send a response back to CloudFormation based upon the event object. As you can see, this greatly simplifies the developer effort required to get a working custom resource that will correctly respond to CloudFormation Custom Resource Requests.

The Decorator Patterns

The most important part of this library are the Decorator patterns. These provide Python decorators that can be put around handler functions, or resource specific functions, that prepare the data for ease of usage. These decorators will also handle exceptions for you.

@accustom.decorator()

This is the primary decorator in the library.
The purpose of this decorator is to take the return value of the handler function, and respond back to CloudFormation based upon the input event automatically.

It takes the following options:

- enforceUseOfClass (Boolean): When this is set to True, you must use a ResponseObject. This is implicitly set to true if no Lambda Context is provided.
- hideResourceDeleteFailure (Boolean): When this is set to True the function will return SUCCESS even on getting an Exception for Delete requests.
- redactConfig (accustom.RedactionConfig): For more details on how this works please see "Redacting Confidential Information From Logs".
- timeoutFunction (Boolean): Will automatically send a failure signal to CloudFormation before Lambda timeout provided that this function is executed in Lambda.

Without a ResponseObject the decorator will make the following assumptions:

- if a Lambda Context is not passed, the function will return FAILED
- if a dictionary is passed back, this will be used for the Data to be returned to CloudFormation and the function will return SUCCESS
- if a string is passed back, this will be put in the return attribute Return and the function will return SUCCESS
- if None or True is passed back, the function will return SUCCESS
- if False is passed back, the function will return FAILED

@accustom.rdecorator()

This decorator, known as the "Resource Decorator", is used when you break the function into different resources, e.g.
by making a decision based upon which ResourceType was passed to the handler and calling a function related to that resource.

It takes the following options:

- decoratorHandleDelete (Boolean): When set to True, if a Delete request is made in event the decorator will return a ResponseObject with SUCCESS without actually executing the decorated function.
- genUUID (Boolean): When set to True, if the PhysicalResourceId in the event is not set, automatically generate a UUID4 and put it in the PhysicalResourceId field.
- expectedProperties (Array or Tuple): Pass in a list or tuple of properties that you want to check for before running the decorated function. If any are missing, return FAILED.

The most useful of these options is expectedProperties. With it, it is possible to quickly define mandatory properties for your resource and fail if they are not present.

@accustom.sdecorator()

This decorator is just a combination of @accustom.decorator() and @accustom.rdecorator(). This allows you to have a single, stand-alone resource handler that has some defined properties and can automatically handle delete. The options available to it are the combination of both of the options available to the other two decorators, except for redactProperties, which takes an accustom.StandaloneRedactionConfig object instead of an accustom.RedactionConfig object. For more information on redactProperties see "Redacting Confidential Information From Logs".

The other important note about combining these two decorators is that hideResourceDeleteFailure becomes redundant if decoratorHandleDelete is set to True.

Response Function and Object

The cfnresponse() function and the ResponseObject are convenience functions for interacting with CloudFormation.

cfnresponse()

cfnresponse() is a traditional function. At the very minimum it needs to take in the event and a status, SUCCESS or FAILED. In practice this function will likely not be used very often outside the library, but it is included for completeness.
For more details look directly at the source code for this function.

ResponseObject

The ResponseObject allows you to define a message to be sent to CloudFormation. It only has one method, send(), which uses the cfnresponse() function under the hood to fire the event. A response object can be initialised and fired with:

import accustom

def handler(event, context):
    r = accustom.ResponseObject()
    r.send(event)

If you are using the decorator pattern it is strongly recommended that you do not invoke the send() method, and instead allow the decorator to process the sending of the events for you by returning from your function.

To construct a response object you can provide the following optional parameters:

- data (Dictionary): data to be passed in the response. Must be a dict if used.
- physicalResourceId (String): physical resource ID to be used in the response.
- reason (String): reason to pass back to CloudFormation in the response object.
- responseStatus (accustom.Status): response status to use in the response object, defaults to SUCCESS.
- squashPrintResponse (Boolean): in DEBUG logging the function will often print out the Data section of the response. If the Data contains confidential information you'll want to squash this output. This option, when set to True, will squash the output.

Logging Recommendations

The decorators utilise the logging library for logging. It is strongly recommended that your function does the same, and sets the logging level to at least INFO. Ensure the log level is set before importing Accustom.

import logging
logger = logging.getLogger(__name__)
logging.getLogger().setLevel(logging.INFO)

import accustom

Redacting Confidential Information From DEBUG Logs

If you often pass confidential information like passwords and secrets in properties to Custom Resources, you may want to prevent certain properties from being printed to debug logs.
To help with this we provide functionality to either blocklist or allowlist Resource Properties based upon provided regular expressions.

To utilise this functionality you must initialise and include a RedactionConfig. A RedactionConfig consists of some flags to define the redaction mode and whether the response URL should be redacted, as well as a series of RedactionRuleSet objects that define what to redact based upon regular expressions. There is a special case of RedactionConfig called a StandaloneRedactionConfig that has one, and only one, RedactionRuleSet that is provided at initialisation.

Each RedactionRuleSet defines a single regex that determines which ResourceTypes the rule set should be applied to. You can then apply any number of rules, based upon an explicit property name or a regex. Please see the definitions, and an example below.

RedactionRuleSet

The RedactionRuleSet object allows you to define a series of properties or regexes which to allowlist or blocklist for a given resource type regex. It is initialised with the following:

- resourceRegex (String): the regex used to work out what resources to apply this to.

add_property_regex(propertiesRegex)
- propertiesRegex (String): the regex used to work out what properties to allowlist/blocklist

add_property(propertyName)
- propertyName (String): the name of the property to allowlist/blocklist

RedactionConfig

The RedactionConfig object allows you to create a collection of RedactionRuleSet objects as well as define what mode (allowlist/blocklist) to use, and whether the presigned URL provided by CloudFormation should be redacted from the logs.

- redactMode (accustom.RedactMode): what redaction mode should be used, i.e. whether it should be a blocklist or allowlist
- redactResponseURL (Boolean): whether the response URL should not be logged

add_rule_set(ruleSet)
- ruleSet (accustom.RedactionRuleSet): the rule set to be added to the RedactionConfig

StandaloneRedactionConfig

The StandaloneRedactionConfig object allows you to apply a single RedactionRuleSet object as well as
define what mode (allowlist/blocklist) to use, and whether the presigned URL provided by CloudFormation should be redacted from the logs.

- redactMode (accustom.RedactMode): what redaction mode should be used, i.e. whether it should be a blocklist or allowlist
- redactResponseURL (Boolean): whether the response URL should not be logged
- ruleSet (accustom.RedactionRuleSet): the rule set to be added to the RedactionConfig

Example of Redaction

The below example takes in two rule sets. The first rule set applies to all resource types, and the second rule set applies only to the Custom::Test resource type.

All resources will have properties called Test and Example redacted and replaced with [REDACTED]. The Custom::Test resource will additionally redact properties called Custom and those that start with DeleteMe. Finally, as redactResponseURL is set to True, the response URL will not be printed in the debug logs.

from accustom import RedactionRuleSet, RedactionConfig, decorator

ruleSetDefault = RedactionRuleSet()
ruleSetDefault.add_property_regex('^Test$')
ruleSetDefault.add_property('Example')

ruleSetCustom = RedactionRuleSet('^Custom::Test$')
ruleSetCustom.add_property('Custom')
ruleSetCustom.add_property_regex('^DeleteMe.*$')

rc = RedactionConfig(redactResponseURL=True)
rc.add_rule_set(ruleSetDefault)
rc.add_rule_set(ruleSetCustom)

@decorator(redactConfig=rc)
def resource_handler(event, context):
    result = (float(event['ResourceProperties']['Test']) +
              float(event['ResourceProperties']['Example']))
    return {'sum': result}

Note on Timeouts and Permissions

The timeout is implemented using asynchronous chained invocation of your Lambda function. For this reason, please be aware of the following limitations:

- The function must have access to the Lambda API Endpoints in order to self invoke.
- The function must have permission to self invoke (i.e. the lambda:InvokeFunction permission).

If your requirements violate any of these conditions, set the timeoutFunction option to False.
Please also note that this will double the invocations per request, so if you're not in the free tier for Lambda make sure you are aware of this as it may increase costs.

Constants

We provide three sets of constants for ease of use:

Status
- SUCCESS: accustom.Status.SUCCESS
- FAILED: accustom.Status.FAILED

RequestType
- Create: accustom.RequestType.CREATE
- Update: accustom.RequestType.UPDATE
- Delete: accustom.RequestType.DELETE

RedactMode
- Blocklisting: accustom.RedactMode.BLOCKLIST
- Allowlisting: accustom.RedactMode.ALLOWLIST

How to Contribute

Feel free to open issues, fork, or submit a pull request:

- Issue Tracker: https://github.com/awslabs/cloudformation-accustom/issues
- Source Code: https://github.com/awslabs/cloudformation-accustom
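Under the hood, "responding to CloudFormation" means HTTP PUTing a small JSON document to the presigned S3 URL carried in the request event. A rough sketch of that protocol is below; the field names follow the CloudFormation custom-resource response specification, but the helper functions themselves are illustrative, not accustom's actual implementation:

```python
import json
from urllib import request

def build_response(event, status, data=None, physical_resource_id=None, reason=None):
    """Assemble the JSON body CloudFormation expects back from a custom resource."""
    return {
        "Status": status,  # "SUCCESS" or "FAILED"
        "Reason": reason or "See CloudWatch logs",
        "PhysicalResourceId": (physical_resource_id
                               or event.get("PhysicalResourceId")
                               or event["LogicalResourceId"]),
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
        "Data": data or {},
    }

def send(event, status, **kwargs):
    """PUT the response document to the presigned ResponseURL from the event."""
    body = json.dumps(build_response(event, status, **kwargs)).encode()
    req = request.Request(event["ResponseURL"], data=body, method="PUT")
    with request.urlopen(req) as resp:
        return resp.status
```

This is the work the decorators save you from doing by hand: they build this document from your return value and fire the PUT for you.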
accutuning-helpers
No description available on PyPI.
accuweather
accuweather

Python wrapper for getting weather data from AccuWeather API.

API key

To generate an API key go to https://developer.accuweather.com/user/register and after registration create an app.

How to use package

"""Example of usage."""
import asyncio
import logging

from aiohttp import ClientError, ClientSession

from accuweather import (
    AccuWeather,
    ApiError,
    InvalidApiKeyError,
    InvalidCoordinatesError,
    RequestsExceededError,
)

LATITUDE = 52.0677904
LONGITUDE = 19.4795644
API_KEY = "xxxxx"

logging.basicConfig(level=logging.DEBUG)


async def main():
    """Run main function."""
    async with ClientSession() as websession:
        try:
            accuweather = AccuWeather(
                API_KEY,
                websession,
                latitude=LATITUDE,
                longitude=LONGITUDE,
                language="pl",
            )
            current_conditions = await accuweather.async_get_current_conditions()
            forecast_daily = await accuweather.async_get_daily_forecast(
                days=5, metric=True
            )
            forecast_hourly = await accuweather.async_get_hourly_forecast(
                hours=12, metric=True
            )
        except (
            ApiError,
            InvalidApiKeyError,
            InvalidCoordinatesError,
            ClientError,
            RequestsExceededError,
        ) as error:
            print(f"Error: {error}")
        else:
            print(f"Location: {accuweather.location_name} ({accuweather.location_key})")
            print(f"Requests remaining: {accuweather.requests_remaining}")
            print(f"Current: {current_conditions}")
            print(f"Forecast: {forecast_daily}")
            print(f"Forecast hourly: {forecast_hourly}")


loop = asyncio.new_event_loop()
loop.run_until_complete(main())
loop.close()
accuwython
This project is intended to ease the access to the AccuWeather API.
acd
Hierarchical neural-net interpretations (ACD) 🧠

Produces hierarchical interpretations for a single prediction made by a pytorch neural network. Official code for Hierarchical interpretations for neural network predictions (ICLR 2019 pdf).

Documentation • Demo notebooks

Note: this repo is actively maintained. For any questions please file an issue.

examples/documentation

- installation: pip install acd (or clone and run python setup.py install)
- examples: the reproduce_figs folder has notebooks with many demos
- src: the acd folder contains the source for the method implementation
- allows for different types of interpretations by changing hyperparameters (explained in examples)
- tested with pytorch >1.0 with/without gpu
- all required data/models/code for reproducing are included in the dsets folder

Inspecting NLP sentiment models • Detecting adversarial examples • Analyzing imagenet models

notes on using ACD on your own data

- the current CD implementation doesn't always work for all types of networks. If you are getting an error inside of cd.py, you may need to write a custom function that iterates through the layers of your network (for examples see cd.py). Should work out-of-the-box for many common layers though, including anything in alexnet, vgg, or resnet.
- to use baselines such as build-up and occlusion, replace the pred_ims function by a function which gets predictions from your model given a batch of examples.

related work

- PDR framework (PNAS 2019 pdf) - an overarching framework for guiding and framing interpretable machine learning
- CDEP (ICML 2020 pdf, github) - penalizes CD / ACD scores during training to make models generalize better
- TRIM (ICLR 2020 workshop pdf, github) - using simple reparameterizations, allows for calculating disentangled importances to transformations of the input (e.g.
assigning importances to different frequencies)
- DAC (arXiv 2019 pdf, github) - finds disentangled interpretations for random forests
- Baseline interpretability methods - the file scores/score_funcs.py also contains simple pytorch implementations of integrated gradients and the simple interpretation technique gradient * input

reference

- feel free to use/share this code openly
- if you find this code useful for your research, please cite the following:

@inproceedings{singh2019hierarchical,
    title={Hierarchical interpretations for neural network predictions},
    author={Chandan Singh and W. James Murdoch and Bin Yu},
    booktitle={International Conference on Learning Representations},
    year={2019},
    url={https://openreview.net/forum?id=SkEqro0ctQ},
}
acd2cwl
acd2cwl provides:

- a2c-tools, a generator of wrappers for EMBOSS tools. It uses the formal definition of the tools provided by the ACD files.
- a2c-tests, a generator of functional tests for these tools. It uses the functional tests defined in the EMBOSS package itself.

Install

Installing the official package from PyPI:

pip install acd2cwl

Or from source:

git clone https://github.com/hmenager/acd2cwl.git
cd acd2cwl && python setup.py install

Run tool wrappers generator

Simple command:

a2c-tools /usr/share/EMBOSS/acd/*.acd

For more options, just type a2c-tools --help

Run test jobs generator

Simple command:

a2c-tests /usr/share/EMBOSS/test/qatest.dat /usr/share/EMBOSS/acd/*.acd

For more options, just type a2c-tests --help
acda
Overview

This repository stores the drug synergy prediction codebase for the Augmented Cancer Drug Atlas (ACDA) and the analysis jupyter notebooks which use the ACDA code. We augmented the drug synergy prediction modeling approach CDA described in Narayan et al. by applying a Random Forest Regression and optimization via cross-validation hyperparameter tuning. For ease of sharing and use we implemented it as a python package. See documentation at https://acda.readthedocs.io.

Installation and Dependencies

The main prerequisite to install the Python package is a python >3.8 environment. To install, run:

pip install acda

The dependencies are installed automatically with the command above. See setup.py for the very basic dependency list.

Methods Description

Drug synergy prediction is a complex problem typically approached with Machine Learning techniques by using molecular and pharmacological data. A recently published method, Cancer Drug Atlas, CDA (Narayan et al. 2020), for drug synergy prediction in cell line models uses drug target information, knowledge of genes mutated in each model, and models' monotherapy drug sensitivity. The approach builds a logistic regression model to predict a binary synergy outcome. Here, we improved the CDA drug synergy prediction modeling approach by applying a CART-based model instead of a linear regression.

The details of the ACDA methods and benchmarking results are described in the preprint.

Contents

| Directory/File | Description |
|----------------|-------------|
| acda/ | Python source code of the package |
| docs/ | Source code of the documentation |
| scripts/ | Scripts and additional functions which are used with the package |
| ChangeLog.md | File details changes implemented in new releases |
| LICENSE | The license agreement statement |
| [Other misc files] | ...
|

Data

The detailed examples and data requirements are in the documentation.

This package makes use of the data outlined below:

- Sanger molecular data: https://cellmodelpassports.sanger.ac.uk/downloads
- GDSC pharmacology data: https://www.cancerrxgene.org/downloads/bulk_download
- DrugComb sensitivity and synergy data: https://drugcomb.org/download/

References:

- Narayan, R. S., Molenaar, P., Teng, J., Cornelissen, F. M. G., Roelofs, I., Menezes, R., Dik, R., et al. (2020). A cancer drug atlas enables synergistic targeting of independent drug vulnerabilities. Nature Communications, 11/1: 2935. Nature Publishing Group.
- Lianlian Wu, Yuqi Wen, Dongjin Leng, Qinglong Zhang, Chong Dai, Zhongming Wang, Ziqi Liu, Bowei Yan, Yixin Zhang, Jing Wang, Song He and Xiaochen Bo. Machine learning methods, databases and tools for drug combination prediction. Briefings in Bioinformatics, 23(1), 2022, 1-21.
- Shuyu Zheng, Jehad Aldahdooh, Tolou Shadbahr, Yinyin Wang, Dalal Aldahdooh, Jie Bao, Wenyu Wang, Jing Tang. DrugComb update: a more comprehensive drug sensitivity data repository and analysis portal. Nucleic Acids Research, Volume 49, Issue W1, 2 July 2021, Pages W174-W184.
acdata
No description available on PyPI.
acdb
# Package acdb

Package acdb manages objects between memory and file system.

```sh
$ pip install acdb
```

# Mem

MemDriver cares to store data on memory, this means that MemDriver is fast. Since there is no expiration mechanism, be careful that it might eat up all your memory.

```py
db = acdb.mem()
db.set('name', 'acdb')
assert db.get('name') == 'acdb'
```

# Doc

DocDriver uses the OS's file system to manage data. In general, any high frequency operation is not recommended unless you have an enough reason.

```py
db = acdb.doc("/tmp/dat")
```

# Lru

In computing, cache algorithms (also frequently called cache replacement algorithms or cache replacement policies) are optimizing instructions, or algorithms, that a computer program or a hardware-maintained structure can utilize in order to manage a cache of information stored on the computer.

Caching improves performance by keeping recent or often-used data items in memory locations that are faster or computationally cheaper to access than normal memory stores. When the cache is full, the algorithm must choose which items to discard to make room for the new ones.

Least recently used (LRU) discards the least recently used items first. It has a fixed size (to limit memory usage) and O(1) time lookup.

```py
db = acdb.lru(1024)
```

# Syn

MapDriver is based on DocDriver and uses LruDriver to provide caching at its interface layer. The size of LruDriver is always 1024.

```py
db = acdb.syn("/tmp/dat")
```

# Licences

MIT
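The LRU policy described above is easy to sketch with an ordered dict. This is a generic illustration of the eviction idea, not acdb's actual driver code:

```python
from collections import OrderedDict

class Lru:
    """Fixed-size cache with O(1) lookup; evicts the least recently used key."""

    def __init__(self, size):
        self.size = size
        self.data = OrderedDict()

    def set(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)  # refresh recency on overwrite
        self.data[key] = value
        if len(self.data) > self.size:
            self.data.popitem(last=False)  # drop the least recently used item

    def get(self, key):
        self.data.move_to_end(key)  # a read also counts as a use
        return self.data[key]
```

Because the OrderedDict keeps keys in usage order, both eviction and lookup stay constant-time, which is exactly the property the text above claims for the LRU driver.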
acdc
acdc

Wanna-be fast string searcher

beta
acdcli
acd_cliacd_cliprovides a command line interface to Amazon Drive and allows Unix users to mount their drive using FUSE for read and (sequential) write access. It is currently in beta stage.Node Cache Featureslocal caching of node metadata in an SQLite databaseaddressing of remote nodes via a pathname (e.g./Photos/kitten.jpg)file searchCLI Featurestree or flat listing of files and folderssimultaneous uploads/downloads, retry on errorbasic plugin supportFile Operationsupload/download of single files and directoriesstreamed upload/downloadfolder creationtrashing/restoringmoving/renaming nodesDocumentationThe full documentation is available athttps://acd-cli.readthedocs.io.Quick StartHave a look at theknown issues, then follow thesetup guideandauthorize. You may then use the program as described in theusage guide.CLI Usage ExampleIn this example, a two-level folder hierarchy is created in an empty drive. Then, a relative local pathlocal/spamis uploaded recursively using two connections.$ acd_cli sync Getting changes... Inserting nodes.. $ acd_cli ls / [PHwiEv53QOKoGFGqYNl8pw] [A] / $ acd_cli mkdir /egg/ $ acd_cli mkdir /egg/bacon/ $ acd_cli upload -x 2 local/spam/ /egg/bacon/ [################################] 100.0% of 100MiB 12/12 654.4KB/s $ acd_cli tree / egg/ bacon/ spam/ sausage spam [...]The standard node listing format includes the node ID, the first letter of its status and its full path. 
Possible statuses are “AVAILABLE” and “TRASH”.Known IssuesIt is not possible to upload files using Python 3.2.3, 3.3.0 and 3.3.1 due to a bug in the http.client module.API Restrictionsthe current upload file size limit is 50GiBuploads of large files >10 GiB may be successful, yet a timeout error is displayed (please check the upload by syncing manually)storage of node names is case-preserving, but not case-sensitive (this should not concern Apple users)it is not possible to share or delete filesContributeHave a look at thecontributing guidelines.Recent Changes0.3.2added--remove-source-filesargument to upload actionadded--times`argument to download action for preservation of modification timesadded streamed overwrite actionfixed upload of directories containing broken symlinksdisabled FUSE autosync by defaultadded timeout handling for uploads of large filesfixed exit status >=256added config filesadded syncing to/from filefixed download of files with failed (incomplete) chunks0.3.1general improvements for FUSEFUSE write support addedadded automatic loggingsphinx documentation added0.3.0FUSE read support added0.2.2sync speed-upnode listing format changedoptional node listing coloring added (for Linux or via LS_COLORS)re-added possibility for local OAuth0.2.1curl dependency removedadded job queue, simultaneous transfersretry on error0.2.0setuptools supportworkaround for download of files larger than 10 GiBautomatic resuming of downloads
acdc-nn
ACDC-NN

ACDC-NN is a novel antisymmetric neural network to predict protein free energy changes upon point variations along the amino acid sequence. The ACDC-NN model was built so that it can be used to make predictions in two different ways:

- when both the wild-type and variant structure are available, these are respectively used as direct and inverse inputs so that the network can provide a prediction that, by construction, is perfectly antisymmetric;
- when only the wild-type structure is available, as usual, the input for the inverse substitution is created starting from the direct one by inverting the variation encoding but preserving the same structure.

For further information about the ACDC-NN architecture and properties, please see the related paper https://doi.org/10.1088/1361-6463/abedfb

ACDC-NN Seq is a sequence-based version of ACDC-NN that does not require the structure of the protein; further information is available in the paper https://doi.org/10.3390/genes12060911

About this repository

Here you can find the instructions to easily install ACDC-NN on your computer using pip (see commands below). In this version, ACDC-NN was trained using all datasets available in the literature without correcting for sequence similarity. In case you want to replicate our paper results you will find a jupyter notebook inside the 'results_replication' folder. There ACDC-NN was trained using a 10-fold cross-validation taking into account sequence similarity to avoid overfitting.

Installation

We recommend using pip:

pip install acdc-nn

Requirements:

| Requirement | Minimum tested version |
|-------------|------------------------|
| python | 3.6 |
| tensorflow | 2.3.1 |
| Biopython | 1.78 |
| numpy | 1.19.5 |
| pandas | 1.1.5 |
| silence_tensorflow | 1.1.1 |

Usage

To predict the change of the folding free energy (DDG) due to a point substitution in a protein sequence, ACDC-NN needs both evolutionary and structural information about the protein itself. The structural information is from a PDB file.
The evolutionary information is from a profile file, simple tab-separated table of the frequencies of each residue in each position in homologous proteins. Positive DDG values are stabilizing.When no structural information is available, the sequence-based ACDC-NN Seq network must be used:acdc-nn seq SUB PROFILEWhen information is available only for the wild-type protein, the predictor can be run as:acdc-nn struct SUB PROFILE PDB CHAINwhere SUB is the point substitution, PROFILE and PDB are the paths to the profile and PDB files, and CHAIN is the PDB chain where the substitution occurs. SUB is in the form XNY where X is the wild-type residue, N is the position of the substitution, and Y is the mutated residue. X and Y are given as a one-letter amino acid code and N is 1-based and referred to the PDB numbering of the relevant chain, and not the position in the sequence. Both PDB and profile files are automatically decompressed when they have a ".gz" extension.When information is available also for the mutated protein, a better prediction can be got as:acdc-nn istruct SUB WT-PROFILE WT-PDB WT-CHAIN INV-SUB MT-PROFILE MT-PDB MT-CHAINTo predict more than a few substitutions, we provide a batch mode:acdc-nn batch SUBSwhere SUBS is the path to a tab-separated table with a row for each substitution to be predicted. For substitutuion where no structural information is available the row format is:SUB PROFILEFor substitutions where only the wild-type protein data is available, the row format is:SUB PROFILE PDB CHAINFor substitutions where also the mutated protein data is available, the row format is:SUB WT-PROFILE WT-PDB WT-CHAIN INV-SUB MT-PROFILE MT-PDB MT-CHAINThe three formats can be mixed arbitrarily in the same file.ExamplesThese examples use the data in the tests directory of the github repository. 
No structure available:> acdc-nn seq Q104H tests/profiles/2ocjA.prof.gz 0.06451824Single substitution:> acdc-nn struct Q104H tests/profiles/2ocjA.prof.gz tests/structures/2ocj.pdb.gz A 0.15008962Single substitution with the structure of the mutated protein> acdc-nn istruct V51I tests/profiles/1bsaA.prof.gz tests/structures/1bsa.pdb.gz A I51V tests/profiles/1bniA.prof.gz tests/structures/1bni.pdb.gz A 0.48577148 > acdc-nn istruct I51V tests/profiles/1bniA.prof.gz tests/structures/1bni.pdb.gz A V51I tests/profiles/1bsaA.prof.gz tests/structures/1bsa.pdb.gz A -0.48577148NB: In the above example we have specifically chosen two homologous proteins that have similar structure.
acdcreate
Biliboba ate the description :)
acdcreatereal
Biliboba ate the description :)
acdcserver
No description available on PyPI.
acdctools
Acdc Tools

Collection of tools (widgets, utils, io functions, etc.) used by Cell-ACDC and spotMAX.
acdecom
No description available on PyPI.
acdh-abbr-client
ABBR Client

A python client to interact with abbr.acdh.oeaw.ac.at

- Free software: MIT license
- Documentation: https://acdh-abbr-client.readthedocs.io.

Features

- TODO

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History

0.1.0 (2020-12-13)

- First release on PyPI.
acdh-arche-assets
Arche Assets

Set of static assets used (mainly) for ARCHE data preprocessing or ARCHE information pages:

- URI normalization rules used within the ACDH-CH (stored in AcdhArcheAssets/uriNormRules.json)
- Description of input data formats accepted by ARCHE (stored in AcdhArcheAssets/formats.json)

The repository also provides Python 3 and PHP bindings for accessing those assets.

Installation & usage

Python

Install using pip3:

pip3 install acdh-arche-assets

Use with

from AcdhArcheAssets.uri_norm_rules import get_rules, get_normalized_uri, get_norm_id

print(f"{get_rules()}")

wrong_id = "http://sws.geonames.org/1232324343/linz.html"
good_id = get_normalized_uri(wrong_id)
print(good_id)
# "https://sws.geonames.org/1232324343/"

# extract ID from URL
norm_id = get_norm_id("http://sws.geonames.org/1232324343/linz.html")
print(norm_id)
# "1232324343"

from AcdhArcheAssets.file_formats import get_formats, get_by_mtype, get_by_extension

formats = get_formats()
matching_mapping = get_by_mtype('image/png')
matching_mapping = get_by_extension('png')

PHP

Install using composer:

composer require acdh-oeaw/arche-assets

Usage with

require_once 'vendor/autoload.php';

print_r(acdhOeaw\UriNormRules::getRules());
print_r(acdhOeaw\UriNormRules::getRules(['viaf', 'gnd']));

print_r(acdhOeaw\ArcheFileFormats::getAll());
print_r(acdhOeaw\ArcheFileFormats::getByMime('application/json'));
print_r(acdhOeaw\ArcheFileFormats::getByExtension('application/json'));

Description of assets

URI normalization rules

Each rule consists of five properties:

- name: a rule name
- match: a regular expression matching a given URI namespace
- replace: a regular expression replace expression normalizing an URI in a given namespace
- resolve: a regular expression replace expression transforming an URI in a given namespace to a URL fetching RDF data
- format: a RDF serialization format to be requested while resolving the URL produced using the resolve field

Formats

A curated and growing list of file extensions.
For each file extension, mappings to the respective ARCHE Resource Type Category (stored in acdh:hasCategory) and Media Type (MIME type) (stored in acdh:hasFormat) are given. The indicated Media Type should only be used as a fallback; it is best practice to rely on automated Media Type detection based on file signatures.

Further information is provided as well:

- fileExtension: file extension to be mapped.
- name: name(s) the format is known by
- archeCategory: the corresponding URI of the ARCHE Resource Type Category Vocabulary
- dataType: a broad category to group formats in; mainly intended for visualisation purposes.
- pronomID: ID(s) assigned by PRONOM
- mimeType: official Media Type(s) (formerly known as MIME types) registered at IANA.
- informalMimeType: other MIME types known for the format
- magicNumber: a constant numerical or text value used to identify a file format, e.g. Wikipedia list of file signatures
- ianaTemplate: link to template at IANA
- reference: link(s) to format specifications referenced by IANA and others
- longTerm: indicates if a format is suitable for long-term preservation. Possible values and their meaning:
  - yes - long-term format
  - no - not suitable, another format should be used
  - restricted - can be used for long-term preservation in some cases (see comment)
  - unsure - status remains to be evaluated
- archeDocs: link to a place with more information for the format.
- comment: any other noteworthy information not stated elsewhere.

Development (Python)

- install needed development packages: pip install -r requirements_dev.txt
- linting, tests and test coverage
  - to run the tests: tox
  - check coverage and create a report: coverage run setup.py test and coverage html
  - check linting: flake8
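The match/replace rule shape described above can be applied with plain re. The rule below is a hypothetical entry in the style of uriNormRules.json, reproducing the geonames normalization shown in the Python usage example:

```python
import re

# Hypothetical rule in the name/match/replace shape described above
RULE = {
    "name": "geonames",
    "match": r"^https?://(www\.)?sws\.geonames\.org/([0-9]+)(/.*)?$",
    "replace": r"https://sws.geonames.org/\2/",
}

def normalize(uri: str, rule=RULE) -> str:
    """Return the normalized form of uri if the rule matches, else uri unchanged."""
    if re.match(rule["match"], uri):
        return re.sub(rule["match"], rule["replace"], uri)
    return uri
```

A real implementation would iterate over the full rule list and apply the first rule whose match expression fires; the resolve/format fields then tell a client how to dereference the normalized URI for RDF data.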
acdh-arche-pyutils
acdh-arche-pyutils

a python client for ARCHE-API

- Free software: MIT
- Documentation: https://acdh-arche-pyutils.readthedocs.io.

Features

- TODO

Credits

This package was created with cookietemple using Cookiecutter.

Changelog

This project adheres to Semantic Versioning.

0.4.0 (2021-02-21)

- ArcheApiClient.get_resource(res_uri) method added to fetch the complete graph of an ARCHE-URI
- ArcheApiClient.write_resource_to_file(res_uri, format='ttl') method added to write the complete graph of an ARCHE-URI to a file

0.3.0 (2021-02-20)

- ArcheApiClient.top_col_ids() method added to fetch all TopCollection URIs and Labels

0.1.0 (2021-02-19)

Added

- Created the project using cookietemple version 1.2.4

Fixed

Dependencies

Deprecated
acdh-baserow-pyutils
acdh-baserow-pyutils

a python client for baserow

install

pip install acdh-baserow-pyutils

how to use

Have a look into tests/test_baserow_client.py

dump all tables of a given database into JSON files

import os
from acdh_baserow_pyutils import BaseRowClient

# store baserow credentials as ENV-Variables
BASEROW_USER = os.environ.get("BASEROW_USER")
BASEROW_PW = os.environ.get("BASEROW_PW")
BASEROW_TOKEN = os.environ.get("BASEROW_TOKEN")  # you need to create a token via baserow
DATABASE_ID = "41426"  # you can get this ID from Baserow

# initialize the client
br_client = BaseRowClient(BASEROW_USER, BASEROW_PW, BASEROW_TOKEN)

# writes all tables from the database as json files into a folder 'out'
# (the folder needs to exist!) and returns a list of the file names
files = br_client.dump_tables_as_json(DATABASE_ID, folder_name='out')
print(files)
# ['out/place.json', 'out/person.json', 'out/profession.json']
acdh-cfts-pyutils
acdh-cfts-pyutils

Python Package to interact with a dedicated Typesense Server and Collection

This package exposes some constants to interact with a dedicated Typesense Server and Collection through other Python scripts.

Also this package is intended to be the one and only place to modify the schema of the given centralised collection.

install

- run pip install acdh-cfts-pyutils
- adapt/modify environment variables to fit your needs. See env.default for an example. If you use this package to populate the ACDH-CH central fulltext search collection running on ACDH-CH's own Typesense server, you'll only need to set TYPESENSE_API_KEY via an environment variable.
acdh-cidoc-pyutils
acdh-cidoc-pyutils

Helper functions for the generation of CIDOC CRMish RDF (from XML/TEI data)

Installation

install via pip install acdh-cidoc-pyutils

Examples

For 'real-world-examples' see e.g. the semantic-kraus project, also take a look into test_cidoc_pyutils.py

extract cidoc:P14i_performed FRBROO:F51_Pursuit triples from tei:person/tei:occupation nodes

import lxml.etree as ET
from rdflib import URIRef
from acdh_cidoc_pyutils import make_occupations, NSMAP

sample = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
<person xml:id="DWpers0091" sortKey="Gulbransson_Olaf_Leonhard">
<persName type="pref">Gulbransson, Olaf</persName>
<occupation notBefore="1900-12" notAfter="2000" key="#hansi" xml:lang="it">Bürgermeister</occupation>
<occupation from="1233-02-03" key="#sumsi">Tischlermeister/Fleischhauer</occupation>
<occupation key="franzi">Sängerin</occupation>
<occupation>Bäckerin</occupation>
</person>
</TEI>"""

doc = ET.fromstring(sample)
x = doc.xpath(".//tei:person[1]", namespaces=NSMAP)[0]
xml_id = x.attrib["{http://www.w3.org/XML/1998/namespace}id"]
subj = URIRef(f"https://foo/bar/{xml_id}")

g, uris = make_occupations(subj, x, "https://foo/bar", id_xpath="@key")
print(g.serialize())

# returns
@prefix ns1: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefixxsd:<http://www.w3.org/2001/XMLSchema#>.<https://foo/bar/DWpers0091>ns1:P14i_performed<https://foo/bar/DWpers0091/occupation/3>,<https://foo/bar/DWpers0091/occupation/franzi>,<https://foo/bar/DWpers0091/occupation/hansi>,<https://foo/bar/DWpers0091/occupation/sumsi>.<https://foo/bar/DWpers0091/occupation/3>a<http://iflastandards.info/ns/fr/frbr/frbroo#F51>;rdfs:label"Bäckerin"@de.<https://foo/bar/DWpers0091/occupation/franzi>a<http://iflastandards.info/ns/fr/frbr/frbroo#F51>;rdfs:label"Sängerin"@de.<https://foo/bar/DWpers0091/occupation/hansi>a<http://iflastandards.info/ns/fr/frbr/frbroo#F51>;rdfs:label"Bürgermeister"@it;ns1:P4_has_time-span<https://foo/bar/DWpers0091/occupation/hansi/time-span>.<https://foo/bar/DWpers0091/occupation/hansi/time-span>ans1:E52_Time-Span;rdfs:label"1900-12 - 2000"^^xsd:string;ns1:P82a_begin_of_the_begin"1900-12"^^xsd:gYearMonth;ns1:P82b_end_of_the_end"2000"^^xsd:gYear.<https://foo/bar/DWpers0091/occupation/sumsi>a<http://iflastandards.info/ns/fr/frbr/frbroo#F51>;rdfs:label"Tischlermeister/Fleischhauer"@de;ns1:P4_has_time-span<https://foo/bar/DWpers0091/occupation/sumsi/time-span>.<https://foo/bar/DWpers0091/occupation/sumsi/time-span>ans1:E52_Time-Span;rdfs:label"1233-02-03 - 1233-02-03"^^xsd:string;ns1:P82a_begin_of_the_begin"1233-02-03"^^xsd:date;ns1:P82b_end_of_the_end"1233-02-03"^^xsd:date.extract birth/death triples fromtei:personimportlxml.etreeasETfromrdflibimportURIReffromacdh_cidoc_pyutilsimportmake_birth_death_entities,NSMAPsample="""<TEI xmlns="http://www.tei-c.org/ns/1.0"><person xml:id="DWpers0091" sortKey="Gulbransson_Olaf_Leonhard"><persName type="pref">Gulbransson, Olaf</persName><birth when="1873-05-26">26. 5. 
1873<placeName key="#DWplace00139">Christiania (Oslo)</placeName></birth>
<death><date notBefore-iso="1905-07-04" when="1955" to="2000">04.07.1905</date>
<settlement key="pmb50"><placeName type="pref">Wien</placeName>
<location><geo>48.2066 16.37341</geo></location></settlement></death>
</person></TEI>"""

doc = ET.fromstring(sample)
x = doc.xpath(".//tei:person[1]", namespaces=NSMAP)[0]
xml_id = x.attrib["{http://www.w3.org/XML/1998/namespace}id"].lower()
item_id = f"https://foo/bar/{xml_id}"
subj = URIRef(item_id)

event_graph, birth_uri, birth_timestamp = make_birth_death_entities(
    subj, x, place_id_xpath="//tei:placeName[1]/@key")
event_graph, death_uri, death_timestamp = make_birth_death_entities(
    subj, x, event_type="death", verbose=True,
    date_node_xpath="/tei:date[1]", place_id_xpath="//tei:settlement[1]/@key")
event_graph.serialize(format="turtle")

# returns
@prefix ns1: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

# birth example
<https://foo/bar/dwpers0091/birth> a ns1:E67_Birth ;
    rdfs:label "Geburt von Gulbransson, Olaf Leonhard"@fr ;
    ns1:P4_has_time-span <https://foo/bar/dwpers0091/birth/time-span> ;
    ns1:P7_took_place_at <https://foo/bar/DWplace00139> ;
    ns1:P98_brought_into_life <https://foo/bar/dwpers0091> .

<https://foo/bar/dwpers0091/birth/time-span> a ns1:E52_Time-Span ;
    rdfs:label "1873-05-26 - 1873-05-26"^^xsd:string ;
    ns1:P82a_begin_of_the_begin "1873-05-26"^^xsd:date ;
    ns1:P82b_end_of_the_end "1873-05-26"^^xsd:date .
# death example
<https://foo/bar/dwpers0091/death> a ns1:E69_Death ;
    rdfs:label "Geburt von Gulbransson, Olaf Leonhard"@fr ;
    ns1:P100_was_death_of <https://foo/bar/dwpers0091> ;
    ns1:P7_took_place_at <https://foo/bar/pmb50> ;
    ns1:P4_has_time-span <https://foo/bar/dwpers0091/death/time-span> .

<https://foo/bar/dwpers0091/death/time-span> a ns1:E52_Time-Span ;
    rdfs:label "1905-07-04 - 2000"^^xsd:string ;
    ns1:P82a_begin_of_the_begin "1905-07-04"^^xsd:date ;
    ns1:P82b_end_of_the_end "2000"^^xsd:gYear .

create ns1:P168_place_is_defined_by "Point(456 123)"^^<geo:wktLiteral> . from tei:coords

import lxml.etree as ET
from rdflib import Graph, URIRef, RDF
from acdh_cidoc_pyutils import coordinates_to_p168, NSMAP, CIDOC

sample = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
<place xml:id="DWplace00092">
<placeName type="orig_name">Reval (Tallinn)</placeName>
<location><geo>123 456</geo></location>
</place></TEI>"""

doc = ET.fromstring(sample)
g = Graph()
for x in doc.xpath(".//tei:place", namespaces=NSMAP):
    xml_id = x.attrib["{http://www.w3.org/XML/1998/namespace}id"].lower()
    item_id = f"https://foo/bar/{xml_id}"
    subj = URIRef(item_id)
    g.add((subj, RDF.type, CIDOC["E53_Place"]))
    g += coordinates_to_p168(subj, x)
print(g.serialize())

# returns
...
ns1:P168_place_is_defined_by "Point(456 123)"^^<geo:wktLiteral> .
...Function parameterverboseprints information in case the given xpath does not return expected results which is a text node with two numbers separated by a given separator (default value isseparator=" ")Function parameterinverse(default:inverse=False) changes the order of the coordinates.date-like-string to casted rdflib.Literalfromacdh_cidoc_pyutilsimportdate_to_literalddates=["1900","1900-01","1901-01-01","foo",]forxindates:date_literal=date_to_literal(x)print((date_literal.datatype))# returns# http://www.w3.org/2001/XMLSchema#gYear# http://www.w3.org/2001/XMLSchema#gYearMonth# http://www.w3.org/2001/XMLSchema#date# http://www.w3.org/2001/XMLSchema#stringmake some random URIfromacdh_cidoc_pyutilsimportmake_uridomain="https://hansi4ever.com/"version="1"prefix="sumsi"uri=make_uri(domain=domain,version=version,prefix=prefix)print(uri)# https://hansi4ever.com/1/sumsi/6ead32b8-9713-11ed-8065-65787314013curi=make_uri(domain=domain)print(uri)# https://hansi4ever.com/8b912e66-9713-11ed-8065-65787314013ccreate an E52_Time-Span graphfromacdh_cidoc_pyutilsimportcreate_e52,make_uriuri=make_uri()e52=create_e52(uri,begin_of_begin="1800-12-12",end_of_end="1900-01")print(e52.serialize())# returns#@prefixns1:<http://www.cidoc-crm.org/cidoc-crm/>. #@prefixrdfs:<http://www.w3.org/2000/01/rdf-schema#>. #@prefixxsd:<http://www.w3.org/2001/XMLSchema#>. 
#<https://hansi4ever.com/387fb457-971b-11ed-8065-65787314013c>ans1:E52_Time-Span;#rdfs:label"1800-12-12 - 1900-01"^^xsd:string;#ns1:P82a_begin_of_the_begin"1800-12-12"^^xsd:date;#ns1:P82b_end_of_the_end"1900-01"^^xsd:gYearMonth.creates E42 from tei:org|place|persontakes a tei:person|place|org node, extracts their@xml:idand alltei:idnoelements, derivesidoc:E42_Identifiertriples and relates them to a passed in subject viacidoc:P1_is_identified_byimportlxml.etreeasETfromrdflibimportGraph,URIRef,RDFfromacdh_cidoc_pyutilsimportmake_e42_identifiers,NSMAP,CIDOCsample="""<TEI xmlns="http://www.tei-c.org/ns/1.0"><place xml:id="DWplace00092"><placeName type="orig_name">Reval (Tallinn)</placeName><placeName xml:lang="de" type="simple_name">Reval</placeName><placeName xml:lang="und" type="alt_label">Tallinn</placeName><idno type="pmb">https://pmb.acdh.oeaw.ac.at/entity/42085/</idno><idno type="URI" subtype="geonames">https://www.geonames.org/588409</idno><idno subtype="foobarid">12345</idno></place></TEI>"""doc=ET.fromstring(sample)g=Graph()forxindoc.xpath(".//tei:place|tei:org|tei:person|tei:bibl",namespaces=NSMAP):xml_id=x.attrib["{http://www.w3.org/XML/1998/namespace}id"].lower()item_id=f"https://foo/bar/{xml_id}"subj=URIRef(item_id)g.add((subj,RDF.type,CIDOC["E53_Place"]))g+=make_e42_identifiers(subj,x,type_domain="http://hansi/4/ever",default_lang="it",)print(g.serialize(format="turtle"))# returns@prefixns1:<http://www.cidoc-crm.org/cidoc-crm/>. @prefixowl:<http://www.w3.org/2002/07/owl#>. 
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

<https://foo/bar/dwplace00092> a ns1:E53_Place ;
    ns1:P1_is_identified_by <https://foo/bar/dwplace00092/identifier/DWplace00092>,
        <https://foo/bar/dwplace00092/identifier/idno/0>,
        <https://foo/bar/dwplace00092/identifier/idno/1>,
        <https://foo/bar/dwplace00092/identifier/idno/2> ;
    owl:sameAs <https://pmb.acdh.oeaw.ac.at/entity/42085/>,
        <https://www.geonames.org/588409> .

<http://hansi/4/ever/idno/URI/geonames> a ns1:E55_Type .

<http://hansi/4/ever/idno/foobarid> a ns1:E55_Type .

<http://hansi/4/ever/idno/pmb> a ns1:E55_Type .

<http://hansi/4/ever/xml-id> a ns1:E55_Type .

<https://foo/bar/dwplace00092/identifier/DWplace00092> a ns1:E42_Identifier ;
    rdfs:label "Identifier: DWplace00092"@it ;
    rdf:value "DWplace00092" ;
    ns1:P2_has_type <http://hansi/4/ever/xml-id> .

<https://foo/bar/dwplace00092/identifier/idno/0> a ns1:E42_Identifier ;
    rdfs:label "Identifier: https://pmb.acdh.oeaw.ac.at/entity/42085/"@it ;
    rdf:value "https://pmb.acdh.oeaw.ac.at/entity/42085/" ;
    ns1:P2_has_type <http://hansi/4/ever/idno/pmb> .

<https://foo/bar/dwplace00092/identifier/idno/1> a ns1:E42_Identifier ;
    rdfs:label "Identifier: https://www.geonames.org/588409"@it ;
    rdf:value "https://www.geonames.org/588409" ;
    ns1:P2_has_type <http://hansi/4/ever/idno/URI/geonames> .

<https://foo/bar/dwplace00092/identifier/idno/2> a ns1:E42_Identifier ;
    rdfs:label "Identifier: 12345"@it ;
    rdf:value "12345" ;
    ns1:P2_has_type <http://hansi/4/ever/idno/foobarid> .

creates appellations from tei:org|place|person

takes a tei:person|place|org node, extracts persName, placeName and orgName texts, @xml:lang and custom type values and returns cidoc:E33_E41 and cidoc:E55 nodes linked via cidoc:P1_is_identified_by and cidoc:P2_has_type

import lxml.etree as ET
from rdflib import Graph, URIRef, RDF
from acdh_cidoc_pyutils import make_appellations, NSMAP, CIDOC

sample = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
<place xml:id="DWplace00092">
<placeName type="orig_name">Reval (Tallinn)</placeName>
<placeName xml:lang="de" type="simple_name">Reval</placeName>
<placeName xml:lang="und"
type="alt_label">Tallinn</placeName><idno type="pmb">https://pmb.acdh.oeaw.ac.at/entity/42085/</idno></place></TEI>"""doc=ET.fromstring(sample)g=Graph()forxindoc.xpath(".//tei:place|tei:org|tei:person|tei:bibl",namespaces=NSMAP):xml_id=x.attrib["{http://www.w3.org/XML/1998/namespace}id"].lower()item_id=f"https://foo/bar/{xml_id}"subj=URIRef(item_id)g.add((subj,RDF.type,CIDOC["E53_Place"]))g+=make_appellations(subj,x,type_domain="http://hansi/4/ever",default_lang="it")g.serialize(format="ttl")# returns@prefixns1:<http://www.cidoc-crm.org/cidoc-crm/>. @prefixrdfs:<http://www.w3.org/2000/01/rdf-schema#>.<https://foo/bar/dwplace00092>ans1:E53_Place;ns1:P1_is_identified_by<https://foo/bar/dwplace00092/appellation/0>,<https://foo/bar/dwplace00092/appellation/1>,<https://foo/bar/dwplace00092/appellation/2>.<http://hansi/4/ever/alt-label>ans1:E55_Type;rdfs:label"alt_label".<http://hansi/4/ever/orig-name>ans1:E55_Type;rdfs:label"orig_name".<http://hansi/4/ever/simple-name>ans1:E55_Type;rdfs:label"simple_name".<https://foo/bar/dwplace00092/appellation/0>ans1:E33_E41_Linguistic_Appellation;rdfs:label"Reval (Tallinn)"@it;ns1:P2_has_type<http://hansi/4/ever/orig-name>.<https://foo/bar/dwplace00092/appellation/1>ans1:E33_E41_Linguistic_Appellation;rdfs:label"Reval"@de;ns1:P2_has_type<http://hansi/4/ever/simple-name>.<https://foo/bar/dwplace00092/appellation/2>ans1:E33_E41_Linguistic_Appellation;rdfs:label"Tallinn"@und;ns1:P2_has_type<http://hansi/4/ever/alt-label>.normalize_stringfromacdh_cidoc_pyutilsimportnormalize_stringstring="""\n\nhallomein schatz ich liebe dichdu bist die einzige für mich"""print(normalize_string(string))# returns# hallo mein schatz ich liebe dich du bist die einzige für michextract date attributes (begin, end)expects typical TEI date attributes like@when, @when-iso, @notBefore, @notAfter, @from, @to, ...and returns a tuple containg start- and enddate values. 
If only @when or @when-iso, or only @notBefore or @notAfter, are provided, the returned values are the same, unless the default parameter fill_missing is set to False.

from lxml.etree import Element
from acdh_cidoc_pyutils import extract_begin_end

date_string = "1900-12-12"
date_object = Element("{http://www.tei-c.org/ns/1.0}tei")
date_object.attrib["when-iso"] = date_string
print(extract_begin_end(date_object))
# returns
# ('1900-12-12', '1900-12-12')

date_string = "1900-12-12"
date_object = Element("{http://www.tei-c.org/ns/1.0}tei")
date_object.attrib["when-iso"] = date_string
print(extract_begin_end(date_object, fill_missing=False))
# returns
# ('1900-12-12', None)

date_object = Element("{http://www.tei-c.org/ns/1.0}tei")
date_object.attrib["notAfter"] = "1900-12-12"
date_object.attrib["notBefore"] = "1800"
print(extract_begin_end(date_object))
# returns
# ('1800', '1900-12-12')

development

pip install -r requirements_dev.txt

flake8 -> linting
coverage run -m pytest -> runs tests and creates coverage stats
acdh-collatex-utils
CollateX Utils

A python package to collate things with collate-x

install

create a virtual environment and install the package with pip install acdh_collatex_utils

use

To collate a bunch of XML/TEI documents located in e.g. ./to_collate run collate -g './to_collate/*.xml'

This creates a folder ./to_collate/collated and saves chunked HTML and TEI files like out__001.html or out__001.tei

To see this package in use in a real life project, please check out frd-data/collate_work.py

develop

- create a virtual environment
- install dev-requirements: pip install -U pip and pip install -r requirements_dev.txt
- install the package in dev-mode: pip install -e .
- run tests with coverage run -m pytest -v
- create a test-report with coverage report or coverage html
acdh-django-archeutils
Django-App to serialize a model class into arche-rdfQuickstartInstall acdh-archeutils:pip install acdh-django-archeutilsAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'archeutils',...)Update your project’s urls.py:urlpatterns=[...url(r'^archeutils/',include('archeutils.urls',namespace='archeutils')),...]
acdh-django-browsing
acdh-django-browsingDjango-App providing some useful things to create browsing viewsQuickstartInstall acdh-django-browsing:pip install acdh-django-browsingAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'browsing',...)
acdh-django-charts
An app to explore your data through charts based on Highcharts.js

Documentation

The full documentation is at https://acdh-django-charts.readthedocs.io.

Quickstart

Install django_charts: pip install acdh-django-charts

Add it to your INSTALLED_APPS:

INSTALLED_APPS = (
    ...
    'charts',
    ...
)

Add django_charts's URL patterns:

urlpatterns = [
    ...
    url(r'^charts/', include('charts.urls', namespace='charts')),
    ...
]

By default the app's templates extend a base template webpage/base.html. To override this, just define a CHARTS_BASE_TEMPLATE variable in your project's settings.py, e.g.:

CHARTS_BASE_TEMPLATE = 'base.html'

To link to the application's 'chart-selector-view' you can add something like the snippet below to your e.g. base-template:

<a href="{% url 'charts:chart_selector' %}">Charts</a>

Configuration

To visualize any property of your model you have to pass in the model's name (lowercase), the field-path (using django's lookup syntax __ to follow foreign key and many2many relations) and the chart type (bar|line|pie) via keyword arguments to charts.views.DynChartView(). In case those params are valid (i.e. the model and the lookup path actually exist) the according chart should be drawn. But be aware that this only works if your project's DEBUG setting is set to True.
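The double-underscore lookup syntax mentioned above can be illustrated with a small standalone sketch; plain dicts stand in for model instances and their related objects here, so this is not code from the package itself:

```python
def follow_lookup(obj, path):
    """Walk a Django-style '__'-separated lookup path through nested dicts."""
    for part in path.split("__"):
        obj = obj[part]
    return obj


# a 'person' row whose foreign key 'profession' points to another row
person = {"name": "Hugo Wolf", "profession": {"name": "composer"}}
print(follow_lookup(person, "profession__name"))  # composer
```

In the real app, Django's ORM resolves such paths against database relations; the sketch only shows why `profession__name` reads "follow the `profession` relation, then take its `name` field".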
As a recommended alternative you should create ChartConfig objects for each property/model you'd like to explore via the django admin-backend.

management commands

The package ships with management commands to

- create/delete chartconfig objects (Bar, Pie, Line charts):
  python manage.py create_charttypes
  python manage.py delete_charttypes
- create/delete ChartConfig objects per application:
  python manage.py create_charts <app_name>
  python manage.py delete_charts <app_name>

Build and publish

python setup.py sdist bdist_wheel
twine upload dist/*

Features

- Visualizes aggregated values of your models as charts (pie/bar/line) using https://www.highcharts.com/
- Charts can be configured via admin backend (see Configuration section)

Running Tests

Does the code actually work?

source <YOURVIRTUALENV>/bin/activate
(myenv) $ pip install tox
(myenv) $ tox

Credits

Tools used in rendering this package:

- Cookiecutter
- cookiecutter-djangopackage

History

0.5.4 (2019-10-11)
- values always returned as strings -> BooleanFields are displayed now

0.5.3 (2019-02-12)
- rewrote code to fetch payload data to avoid mysterious duplicated values

0.5.2 (2018-12-18)
- improved admin interface for ChartConfig

0.5.1 (2018-12-05)
- added management commands to create ChartType and ChartConfig objects.

0.5.0 (2018-10-25)
- added app_name param to ChartConfig to avoid ambiguity in case models in different apps do have the same name.

0.4.1 (2018-07-12)
- minor change in dropdown template tag

0.4.0 (2018-07-10)
- refactoring of templates by introducing template tags

0.3.0 (2018-06-13)
- removed work in progress banner

0.3.0 (2018-06-05)
- In case of DEBUG=False only fieldpaths/models can be explored which are registered in dedicated ChartConfig objects.

0.2.0 (2018-06-01)
- Base templates can now be configured in settings-param

0.1.0 (2018-06-01)
- First release on PyPI.
acdh-django-filechecker
Django-App to store, edit, enrich and serialize the results ofhttps://github.com/acdh-oeaw/repo-file-checkerQuickstartInstall acdh-django-filechecker:pip install acdh-django-filecheckerAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'filechecker',...)Update your project’s urls.py:urlpatterns=[...url(r'^filechecker-rdf/',include('filechecker.fc_arche_urls',namespace='filechecker-rdf')),url(r'^filechecker/',include('filechecker.urls',namespace='filechecker')),...]
acdh-django-geonames
A django package providing models and views for Geoname PlacesDocumentationThe full documentation is athttps://acdh-django-geonames.readthedocs.io.QuickstartInstall GeoName Places:pip install acdh-django-geonamesAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'gn_places.apps.GnPlacesConfig',...)Add GeoName Places’s URL patterns:fromgn_placesimporturlsasgn_places_urlsurlpatterns=[...url(r'^',include(gn_places_urls)),...]FeaturesTODORunning TestsDoes the code actually work?source <YOURVIRTUALENV>/bin/activate (myenv) $ pip install tox (myenv) $ toxDevelopment commandspip install -r requirements_dev.txt invoke -lCreditsTools used in rendering this package:Cookiecuttercookiecutter-djangopackageHistory0.1.0 (2021-01-11)First release on PyPI.
acdh-django-handle
=============================
acdh-django-handle
=============================

.. image:: https://badge.fury.io/py/acdh-django-handle.svg
    :target: https://badge.fury.io/py/acdh-django-handle

A django app to create and manage handle-pids_.

Quickstart
----------

Install acdh-django-handle::

    pip install acdh-django-handle

Add it to your `INSTALLED_APPS`:

.. code-block:: python

    INSTALLED_APPS = (
        ...
        'handle',
        ...
    )

Provide a handle-config dict:

.. code-block:: python

    HANDLE = {
        'resolver': "http://hdl.handle.net",
        'user': "your handle-provider user",
        'pw': "your handle-provider password",
        'url': "base url to your handle-provider api",
        'app_base_url': "the base url of your application"
    }

example:

.. code-block:: python

    HANDLE = {
        'resolver': "http://hdl.handle.net",
        'user': "user11.1234567-01",
        'pw': "password1234",
        'url': "http://pid.gwdg.de/handles/11.1234567-01/",
        'app_base_url': "https://myproject.com"
    }

The value of `app_base_url` will be concatenated with the value of the `get_absolute_url` method of the model instance you want to register a handle for.

And run::

    python manage.py migrate handle

Create/register handle-pids
----

The package provides a management command to bulk create/register handle-pids. For this you'll have to

* add a `GenericRelation` property to the model class you would like to register handles for
* and make sure you have a `get_absolute_url` method defined

.. code-block:: python

    from django.contrib.contenttypes.fields import GenericRelation
    from handle.models import Pid

    ...

    class Example(models.Model):
        name = models.CharField(max_length=255, blank=True,)
        ...
        pid = GenericRelation(Pid, blank=True, null=True, related_query_name="get_pid")
        ...

        def get_absolute_url(self):
            return reverse('example_detail', kwargs={'pk': self.id})

To register/create handle-pids run::

    python manage.py crate_handles example

In case your GenericRelation property is named something else than `pid` you need to pass in the name as second argument, e.g::

    python manage.py crate_handles example --pid=<name>

Handle-Pids will only be created for objects which do not have a handle-pid yet.

Features
--------

* Provides a `Pid` class which stores

  * a handle-pid
  * creation and modification date
  * a generic relation to any other class of your django project
  * an overridden save-method which will register/create a handle-pid on save in case you didn't provide a handle-pid

* Provides a `handle.utils.create_handle` function to register/create a new handle-pid
* Register/Create handle-pids for any objects in your project via admin-interface.
* Provides a management command to bulk create/register handle-pids for all instances of a model-class in your project.

Build and publish
-----

.. code-block:: console

    python setup.py sdist bdist_wheel
    twine upload dist/*

Credits
-------

Tools used in rendering this package:

* Cookiecutter_
* `cookiecutter-djangopackage`_

.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`cookiecutter-djangopackage`: https://github.com/pydanny/cookiecutter-djangopackage
.. _handle-pids: http://www.handle.net/

History
-------

0.1.0 (2018-06-28)
++++++++++++++++++

* First release on PyPI.
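The target URL a registered handle resolves to is simply the concatenation described above. A minimal standalone sketch in plain Python — the `Example` class here is a stand-in for a Django model, not code from the app:

```python
# mirrors the relevant key of the HANDLE settings dict
HANDLE = {'app_base_url': "https://myproject.com"}


class Example:
    """Stand-in for a Django model with a get_absolute_url method."""

    def __init__(self, pk):
        self.pk = pk

    def get_absolute_url(self):
        # a real Django model would build this via reverse()
        return f"/example/{self.pk}"


def handle_target_url(obj):
    """URL a newly registered handle-pid would point to."""
    return HANDLE['app_base_url'] + obj.get_absolute_url()


print(handle_target_url(Example(42)))  # https://myproject.com/example/42
```

Because the two values are joined as-is, `app_base_url` should carry no trailing slash when `get_absolute_url` returns a path starting with one.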
acdh-django-netvis
App to visualize model objects as network graphQuickstartInstall acdh-django-netvis:pip install acdh_django_netvisAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'netvis',...)Add acdh-django-netvis’s URL patterns:urlpatterns=[...url(r'^netvis/',include('netvis.urls',namespace="netvis")),...]Documentationhttps://acdh-django-netvis.readthedocs.io.FeaturesTODORunning TestsDoes the code actually work?source <YOURVIRTUALENV>/bin/activate (myenv) $ python manage.py testCreditsTools used in rendering this package:Cookiecuttercookiecutter-djangopackageHistory2.0.0 (2021-01-29)use load static instead of load staticfiles.1.0.0 (2020-05-03)netvis-JS library does not load latest version any more.0.1.1 (2019-11-18)Included a legend.0.1.0 (2019-11-12)First release on PyPI.
acdh-django-sirad
Parses a SIRAD-ARCHIV and generates a django-app out of it. Also imports the data from the SIRAD-ARCHIV into your django app.
acdh-django-sparql
Django-App providing a query interface and proxy for any common Triplestore.QuickstartInstall acdh-django-sparql:pip install acdh-django-sparqlAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'sparql',...)Add django_sparql’s URL patterns:urlpatterns=[...url(r'^sparql/',include('sparql.urls',namespace='sparql')),...]Provide the endpoint and optional some log-in credentials for your Triplestore in some settings file:BG_URL="https://path-to-your-triple-store/sparql"BG_USER="username"BG_PW="password"browse tohttps://my-project/sparql/query/to reach the query interfaceVia the django-admin interface you can create sample queries.LicensingAll code unless otherwise noted is licensed under the terms of the MIT License (MIT). Please refer to the file LICENSE in the root directory of this repository.
acdh-django-transkribus
A django app for interacting with the [Transkribus-API](https://transkribus.eu/wiki/index.php/REST_Interface) to search and read documents hosted and processed by [Transkribus](https://transkribus.eu/Transkribus/)

Installation

pip install acdh-django-transkribus

Use:

Add your user name and password and the ID of the collection you'd like to expose to the current application's settings file like shown below:

TRANSKRIBUS = {
    "user": "[email protected]",
    "pw": "mytranskribuspassword",
    "col_id": "43497"
}

To make the facsimiles only accessible for logged-in users, you need to add the following flag to your settings:

TRANSKRIBUS_PUBLIC = False

For custom translations you'd need to add the following dict to your settings:

TRANSKRIBUS_TRANSLATIONS = {
    'search_form': {
        'prefil': 'search in all documents',
        'button': 'Go!'
    },
    'search_header': {
        'header': 'Fulltext Search'
    },
    'hits': {
        'facet_header': 'refine your search',
        'facet_doc_title': 'Document',
        'result_header': 'Results',
        'kwic_header': 'KWIC',
        'result_col': 'Collection',
        'result_doc': 'Document',
        'result_page': 'Page',
        'result_link': 'go to document'
    },
    'page': {
        'img_col': 'IMG',
        'text_col': 'TEXT'
    },
    'docs': {
        'title_col': 'Title',
        'page_nr_col': 'Nr. of pages',
        'preview_col': 'Preview',
        'doc_singular': 'Document',
        'doc_plural': 'Documents',
        'page_singular': 'Page',
        'page_plural': 'Pages'
    }
}

History

1.1.0 (2020-03-20)
- added possibility to restrict access to Facsimiles for non logged-in users by adding TRANSKRIBUS_PUBLIC = False to settings

1.0.0 (2020-03-20)
- load staticfiles -> load static to be Django > 3.x compatible

0.3.0 (2019-10-24)
- more translations
- transkribus-css classes added to templates
- added TrpBaseModel

0.2.0 (2019-10-09)
- Added settings param for custom translations

0.1.0 (2019-10-02)
- First release.
acdh-django-vocabs
Curate controlled vocabularies as SKOSDocumentationThe full documentation is athttps://acdh-django-vocabs.readthedocs.io.QuickstartInstall ACDH Django Vocabs:pip install acdh-django-vocabsAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'vocabs.apps.VocabsConfig',...)Add ACDH Django Vocabs’s URL patterns:fromvocabsimporturlsasvocabs_urlsurlpatterns=[...url(r'^',include(vocabs_urls)),...]FeaturesTODORunning TestsDoes the code actually work?source <YOURVIRTUALENV>/bin/activate (myenv) $ pip install tox (myenv) $ toxDevelopment commandspip install -r requirements_dev.txt invoke -lCreditsTools used in rendering this package:Cookiecuttercookiecutter-djangopackagerm -rf ./dist python setup.py sdist bdist_wheel twine upload dist/*History0.1.0 (2021-01-05)First release on PyPI.
acdh-django-zotero
A django package to store and process zotero items
acdh-geonames-utils
Geonames UtilsUtility functions to interact with geonames.orgFree software: MIT licenseDocumentation:https://geonames-utils.readthedocs.io/FeaturesTo use Geonames Utils in a project:from acdh_geonames_utils import acdh_geonames_utils as gn geonames_df = gn.dwonload_to_df('AT') geonames_df.head() # prints the first n rowsCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.2.0 (2021-01-07)First release on PyPI.
acdh-graph-pyutils
ACDH Graph Python UtilitiesContains a set of utilities to work with XML Documents and create RDF Graphs.
acdh-handle-pyutils
acdh-handle-pyutils

Utility functions to interact with the handle.net API

install

pip install acdh-handle-pyutils

how to use

see tests/test_client.py and also check out the provided defaults for acdh_handle_pyutils.client.HandleClient

register handle for url

import os
from acdh_handle_pyutils.client import HandleClient

HANDLE_USERNAME = os.environ.get("HANDLE_USERNAME")
HANDLE_PASSWORD = os.environ.get("HANDLE_PASSWORD")
URL_TO_REGISTER = "https://id.hansi4ever.com/123"

cl = HandleClient(HANDLE_USERNAME, HANDLE_PASSWORD)
result = cl.register_handle(URL_TO_REGISTER, full_url=True)
print(result)
# https://hdl.handle.net/21.11115/0000-000F-743B-D

Be aware that it might take a while until the registered handle resolves

update handle

import os
from acdh_handle_pyutils.client import HandleClient

HANDLE_USERNAME = os.environ.get("HANDLE_USERNAME")
HANDLE_PASSWORD = os.environ.get("HANDLE_PASSWORD")
HANDLE_TO_UPDATE = "https://hdl.handle.net/21.11115/0000-000F-743B-D"
URL_TO_UPDATE = "https://sumsi.com/is-the-best"

cl = HandleClient(HANDLE_USERNAME, HANDLE_PASSWORD)
updated = cl.update_handle(HANDLE_TO_UPDATE, URL_TO_UPDATE)
print(updated.status_code)
# should return the `204 No Content` HTTP response code for a successful update
# 204

Be aware that it might take a while until the handle is actually updated by the handle service provider
acdh-histogis
acdh-histogis is a python package providing a high level api to interact with [HistoGIS](https://histogis.acdh.oeaw.ac.at/)Installationpip install acdh-histogisUse:fromhistogis.histogisimportHistoGisashg# by geonames (id or URL)hg().query_by_service_id(id="https://www.geonames.org/2772400/",when='1860-12-12',polygon=False)# by wikidata (id or URL)hg().query_by_service_id(id="https://www.wikidata.org/wiki/Q41329",when='1860-12-12',polygon=False)# by GND (id or URL)hg().query_by_service_id(service="gnd",id="4074255-6",when='1860-12-12',polygon=False)# returns:{'id':8118,'wikidata_id':'','name':'Linz (Stadt)','alt_name':'','source':'https://histogis.acdh.oeaw.ac.at/api/source/93/?format=json','source_name':'Cisleithania Districts 1880','administrative_unit':'https://histogis.acdh.oeaw.ac.at/api/skosconcepts/135/?format=json','adm_name':'Statutarstadt','start_date':'1850-01-01','end_date':'1918-10-31',...}LicensingAll code unless otherwise noted is licensed under the terms of the MIT License (MIT). Please refer to the file LICENSE in the root directory of this repository.History0.0.5 (2020-03-25)some modifications to dump data methods addedfunction to create singel JSONL file from single files added0.0.4 (2019-04-02)dump data methods added0.0.3 (2019-03-26)code refactoringnew methodsquery_by_service_id to fetch data by geonames, gnd and wikidata ID/URL0.0.2 (2019-03-11)fetch_geonames_rdfandquery_by_geonames_idmethods added0.0.1 (2019-03-11)First version
acdh-id-reconciler
acdh-id-reconcilerpython package to reconcile GND and GeoNames IDs via WikiData.installpip install acdh-id-reconcilerusefrom GND to WikiData and GeoNames IDfromacdh_id_reconcilerimportgnd_to_geonamestest="https://d-nb.info/gnd/4010858-2"results=gnd_to_geonames(test)print(results)# {'wikidata': 'http://www.wikidata.org/entity/Q261664', 'gnd': '4010858-2', 'geonames': '2781124'}from GND to WikiDatafromacdh_id_reconcilerimportgnd_to_wikidatatest="https://d-nb.info/gnd/4074255-6"results=gnd_to_wikidata(test)print(results)# {'wikidata': 'http://www.wikidata.org/entity/Q41329', 'gnd': '4074255-6'}from GND to WikiData plus Custom-IDfromacdh_id_reconcilerimportgnd_to_wikidata_customtest="https://d-nb.info/gnd/118634712"custom="P6194"# https://www.wikidata.org/wiki/Property:P6194results=gnd_to_wikidata_custom(test,custom)print(results)# {'wikidata': 'http://www.wikidata.org/entity/Q215747', 'gnd': '118634712', 'custom': 'W/Wolf_Hugo_1860_1903'}from Geonames to WikiDatafromacdh_id_reconcilerimportgeonames_to_wikidatatest="https://www.geonames.org/2761369"results=geonames_to_wikidata(test)print(results)# {'wikidata': 'http://www.wikidata.org/entity/Q1741', 'geonames': '2761369'}from Geonames to GNDfromacdh_id_reconcilerimportgeonames_to_gndtest="https://www.geonames.org/2761369"results=geonames_to_gnd(test)print(results)# {'wikidata': 'http://www.wikidata.org/entity/Q1741', 'geonames': '2761369', 'gnd': '4066009-6'}from Wikidata to Wikipediafromacdh_id_reconcilerimportwikidata_to_wikipediatest="https://www.wikidata.org/wiki/Q1186567/"result=wikidata_to_wikipedia(test)print(result)# 'https://de.wikipedia.org/wiki/Alexandrinski-Theater'# default language is set to german, can be changed by settings param result e.g. `wiki_lang='enwiki'`result=wikidata_to_wikipedia(test,wiki_lang='enwiki')print(result)# 'https://en.wikipedia.org/wiki/Alexandrinsky_Theatre'
acdh-obj2xml-pyutils
acdh-obj2xml-pyutilspython library to parse BaseRowClient of acdh-baserow-pyutils or any array of objects.HowTo developercreate python environmentpython -m venv envinstallpip install acdh_obj2xml_pyutils(not yet published)installpip install acdh_baserow_pyutils(not yet published)create python file e.g.run.pyadd codefromacdh_obj2xml_pyutilsimportObjectToXmlbr_input=[{"id":"test1","filename":"test1"},{"id":"test2","filename":"test2"}]tei=ObjectToXml(br_input=br_input)output=[xforxintei.make_xml(save=True)]print(output)with BaseRowClientfromacdh_obj2xml_pyutilsimportObjectToXmlfromacdh_baserow_pyutilsimportBaseRowClientbr_client=BaseRowClient(br_base_url="add url",br_table_id='add id',br_token='add token')br_input=[xforxinbr_client.yield_rows()]tei=ObjectToXml(br_input=br_input)output=[xforxintei.make_xml(save=True)]print(output)both versions will create an 'out' directory containing xml files. Important Note! To create filenames the data input must contain a variable with filennames. Default variable is called 'filename' but can be customized by providing an argument to class ObjectToXml().Argumentsbr_inputdata input as array of objectssave_dirdefault out as stringfilenamevariable for filenames as stringtemplate_pathpath for jinja2 template
acdh-spacyal
Spacy Active learning
=====================

.. image:: https://zenodo.org/badge/130271493.svg
   :target: https://zenodo.org/badge/latestdoi/130271493

Django app that uses active learning (deliberately picking the examples to annotate) to retrain the spaCy_ NER module more effectively.

Prerequisites
-------------

For spacyal to run you need a working Celery_ installation. Something along the lines of::

    from __future__ import absolute_import, unicode_literals
    import os
    from celery import Celery

    app = Celery('tasks')

    # Using a string here means the worker doesn't have to serialize
    # the configuration object to child processes.
    # - namespace='CELERY' means all celery-related configuration keys
    #   should have a `CELERY_` prefix.
    app.config_from_object('django.conf:settings', namespace='CELERY')

    # Load task modules from all registered Django app configs.
    app.autodiscover_tasks()

    @app.task(bind=True)
    def debug_task(self):
        print('Request: {0!r}'.format(self.request))

Installation
------------

* Install the package
* include spacyal.urls and spacyal.api_urls in your main url definition
* ensure that you have a base template called base.html
* run python manage.py migrate
* and you should be good to go

.. _Celery: http://www.celeryproject.org/
.. _spaCy: https://www.spacy.io
acdh-spacytei
acdh-spacytei is a python package providing utility classes and functions to processing XML (TEI, TCF) encoded documents with/for spaCyInstallationpip install acdh-spacyteiLicensingAll code unless otherwise noted is licensed under the terms of the MIT License (MIT). Please refer to the file LICENSE in the root directory of this repository.History0.0.9 (2019-02-27)added functions to process prodigy output filespipline processes access model dir0.0.6 (2019-02-27)added a tokenize method to TeiReaderNE information written as rs-tags into TEIminor things0.0.6 (2019-02-27)added langid to install_requiresnew functionrecogito.recogito_dump_to_spacy_neraddedminor things0.0.3 (2019-02-25)minor things0.0.1 (2019-02-25)First version
acdh-tei-pyutils
acdh-tei-pyutilsUtilty functions to work with TEI Documentsinstallrunpip install acdh-tei-pyutilsusageparse an XML/TEI Document from and URL, string or file:fromacdh_tei_pyutils.teiimportTeiReaderdoc=TeiReader("https://raw.githubusercontent.com/acdh-oeaw/acdh-tei-pyutils/master/acdh_tei_pyutils/files/tei.xml")print(doc.tree)>>><Element{http://www.tei-c.org/ns/1.0}TEIat0x7ffb926f9c40>doc=TeiReader("./acdh_tei_pyutils/files/tei.xml")doc.tree>>><Element{http://www.tei-c.org/ns/1.0}TEIat0x7ffb926f9c40>write the current XML/TEI tree object to filedoc.tree_to_file("out.xml")>>>'out.xml'seeacdh_tei_pyutils/cli.pyfor further examplescommand line scriptsBatch process a collection of XML/Documents by adding xml:id, xml:base next and prev attributes to the documents root element run:add-attributes-g"/path/to/your/xmls/*.xml"-b"https://value/of-your/base.com"add-attributes-g"../../xml/grundbuecher/gb-data/data/editions/*.xml"-b"https://id.acdh.oeaw.ac.at/grundbuecher"Write mentions as listEvents into index-files:mentions-to-indices-t"erwähnt in "-i"/path/to/your/xmls/indices/*.xml"-f"/path/to/your/xmls/editions/*.xml"Write mentions as listEvents of index-files and copy enriched index entries into filesdenormalize-indices-f"../../xml/schnitzler/schnitzler-tagebuch-data-public/editions/*.xml"-i"../../xml/schnitzler/schnitzler-tagebuch-data-public/indices/*.xml"denormalize-indices-f"./data/*/*.xml"-i"./data/indices/*.xml"-m".//*[@key]/@key"-x".//tei:title[@level='a']/text()"denormalize-indices-f"./data/*/*.xml"-i"./data/indices/*.xml"-m".//*[@key]/@key"-x".//tei:title[@level='a']/text()"-bpmb2121-bpmb10815-bpmb50Register handle-ids and add them as tei:idno elements:add-handles-g"../../xml/grundbuecher/gb-data/data/editions/*.xml"-user"user12.3456-01"-pw"verysecret"-hixpath".//tei:publicationStmt"developinstall dev-dependencies:pip install -r requirements_dev.txtinstall local packagepip install -e .before commiting runflake8to check linting andcoverage run -m pytest -vto run the 
tests
acdh-transkribus-utils
acdh-transkribus-utilsA python package providing some utility functions for interacting with theTranskribus-APIInstallationpip install acdh-transkribus-utilsUsageAuthenticationSet Transkribus-Credentials as environment variables:[email protected]_PASSWORD=verysecret(or create a file calledenv.secretsimilar toenv.dummyand runsource export_env_variables.sh) you can pass in your credentials also as params e.g.importosfromtranskribus_utils.transkribus_utilsimportACDHTranskribusUtilstr_user=os.environ.get("TRANSKRIBUS_USER")tr_pw=os.environ.get("TRANSKRIBUS_PASSWORD")client=ACDHTranskribusUtils(user=tr_user,password=tr_pw)List all collectionscollections=client.list_collections()forxincollections[-7:]:print(x["colId"],x["colName"])# 188933 bv-play# 188991 Kasten_blau_45_11# 190357 acdh-transkribus-utils# 193145 palm# 195363 Österreichische Bundesverfassung: Datenset A# 196428 Österreichische Bundesverfassung: Datenset B# 196429 Österreichische Bundesverfassung: Datenset CList all documents from a given collectioncol_id=142911documents=client.list_docs(col_id)n=-3forxindocuments[n:]:print(x["docId"],x["title"],x["author"],x["nrOfPages"])# 950920 Kasten_blau_44_9_0050 Pfalz-Neuburg, Eleonore Magdalena Theresia von 1# 950921 Kasten_blau_44_9_0037 Pfalz, Johann Wilhelm Joseph Janaz von der 4# 950922 Kasten_blau_44_9_0239 Pfalz, Johann Wilhelm Joseph Janaz von der 1Download METS files from Collectionfromtranskribus_utils.transkribus_utilsimportACDHTranskribusUtilsCOL_ID=51052client=ACDHTranskribusUtils()client.collection_to_mets(COL_ID)# downloads a METS for each document in the given collection into a folder `./{COL_ID}client.collection_to_mets(COL_ID,file_path='./foo')# downloads a METS for each document in the given collection into a folder `./foo/{COL_ID}client.collection_to_mets(COL_ID,filter_by_doc_ids=[230161,230155])# downloads only METS for document with ID 230161 and 230155 into a folder `./{COL_ID}
acdh-uri-norm-rules
UriNormRules

Set of URI normalization rules used within the ACDH-CD. Provides Python 3 and PHP bindings.

Rules are stored as JSON in the UriNormRules/rules.json file.

Installation & usage

Python

Install using pip3:

    pip3 install acdh_uri-norm-rules

Use with:

```python
from AcdhUriNormRules import get_rules, get_normalized_uri
print(AcdhUriNormRules.get_rules())

wrong_id = "http://sws.geonames.org/1232324343/linz.html"
good_id = get_normalized_uri(wrong_id)
print(good_id)
# "https://www.geonames.org/1232324343"
```

PHP

Install using composer:

    composer require acdh-oeaw/uri-norm-rules

Usage:

```php
require_once 'vendor/autoload.php';
print_r(acdhOeaw\UriNormRules::getRules());
```
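A rule set of match/replace patterns is easy to picture. The snippet below is not the real rules.json schema — the field names and the single geonames rule are invented for illustration — but it reproduces the geonames normalization shown above with plain `re`:

```python
import re

# Hypothetical rule in the spirit of rules.json: a match pattern plus a
# replacement template (the real rule format may differ).
rules = [
    {"match": r"^https?://(?:www\.|sws\.)?geonames\.org/([0-9]+)(/.*)?$",
     "replace": r"https://www.geonames.org/\1"},
]

def normalize(uri):
    # apply the first matching rule; leave unmatched URIs untouched
    for rule in rules:
        if re.match(rule["match"], uri):
            return re.sub(rule["match"], rule["replace"], uri)
    return uri

print(normalize("http://sws.geonames.org/1232324343/linz.html"))
# -> https://www.geonames.org/1232324343
```

Keeping the rules as data rather than code is what lets the project ship identical behaviour for both the Python and PHP bindings.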
acdh-wikidata-pyutils
acdh-wikidata-pyutils

Utility package to fetch data from Wikidata

development

- create virtual env `python -m venv venv` and activate it `source venv/bin/activate`
- install dev-dependencies `pip install -r requirements_dev.txt`
- install acdh-wikidata-pyutils locally `pip install -e .`
- run tests `coverage run -m pytest`

usage

```python
from acdh_wikidata_pyutils import WikiDataPerson

item = WikiDataPerson("https://www.wikidata.org/wiki/Q44331")
person = item.get_apis_entity()
print(person)
# {'name': 'Schnitzler', 'first_name': 'Arthur', 'start_date_written': '1862-05-15', 'end_date_written': '1931-10-21', 'gender': 'male'}
```

```python
from acdh_wikidata_pyutils import WikiDataPlace

item = WikiDataPlace("https://www.wikidata.org/wiki/Q41329")
place = item.get_apis_entity()
print(place)
# {'name': 'Linz', 'lat': 48.30583333333333, 'long': 14.286388888888888}
```
acdh-xml-pyutils
ACDH XML PyUtils

Utility functions to work with XML

Features

- parse XML from files, strings or URLs
- print/save parsed files

see tests/test_xml.py for usage examples
acdpnet
Network Core

A lightweight framework for building communication services in a concise way.

Installation

This project is published as [acdpnet · PyPI](https://pypi.org/project/acdpnet/); for a quick install use:

    pip install acdpnet -i https://pypi.org/simple

Building a service

Basic skeleton:

```python
from acdpnet import services

app = services.Tree()

# extension

app.run(('0.0.0.0', 1035), token='ASDF')
```

The code above quickly builds a server, but it cannot be used yet: user authentication still has to be added on top.

User authentication

Simple way to create users:

```python
app.idf.acessuid = {'Name': 'Password'}
```

Custom authentication:

```python
class Login(services.Idenfaction):
    def setup(self):
        pass

    def Idenfy(self, account: str, key: str):
        if key == self.acessuid.get(account):
            return True
        else:
            return False

app.idf = Login()
```

When subclassing services.Idenfaction, put the authentication logic in the Idenfy method, then overwrite app.idf with it.

Service extensions

Once the basic skeleton is done, features can be added to the server.

Extending with functions:

```python
from acdpnet.tools import Conet  # imported here for code completion only; optional

@app.command('command_name')
def Dosomething(conet: Conet):
    pass
```

Extending with classes:

```python
from acdpnet.tools import Conet

class Serv:
    def command_name_1(conet: Conet):
        pass

    def command_name_2(conet: Conet):
        pass

app.extension(Serv)
```

When extending this way, do not add `self` to the methods; the method name is the command name.

Adding built-in extensions — the library ships with some ready-made extensions, which can be added like this:

```python
from acdpnet.extension import transfer

app.extension(transfer.Transfer)
```

Building a client

Basics:

```python
from acdpnet import nodes

app = nodes.BasicNode('Name', 'Password')
app.connect(('localhost', 1035), token='ASDF')

# do something

app.close()
```

Sending and receiving

Send a command and data:

```python
data = {...}
app.send('Your command', data)
```

Receive a command and data:

```python
resp = app.recv()  # the server decides whether a dict or a list is returned
```
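The `@app.command(...)` registration above is a familiar decorator-registry pattern. This stdlib-only sketch is not acdpnet's implementation — just the same idea reduced to a dict of handlers, so the dispatch mechanics are visible:

```python
# Minimal decorator-based command registry, in the spirit of services.Tree;
# the class name and method names here are illustrative, not acdpnet's API.
class Tree:
    def __init__(self):
        self.commands = {}

    def command(self, name):
        # returns a decorator that stores the handler under `name`
        def register(func):
            self.commands[name] = func
            return func
        return register

    def dispatch(self, name, conet):
        # look up the handler for an incoming command and invoke it
        return self.commands[name](conet)

app = Tree()

@app.command("echo")
def echo(conet):
    return conet

print(app.dispatch("echo", {"msg": "hi"}))  # -> {'msg': 'hi'}
```

A real server would call `dispatch` with the connection object after parsing the command name off the wire; the registry itself stays this simple.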
acd-sdk
Merative Annotator for Clinical Data Python SDK Version 2.0.2

Overview

This Python SDK allows developers to programmatically interact with the following service:

| Service Name | Imported Class Name |
| --- | --- |
| Annotator for Clinical Data | AnnotatorForClinicalDataV1 |

Prerequisites

Refer to the Annotator for Clinical Data documentation: Prerequisites

Software Development Kits

- Python 3.7 or above

Installation

To install, use pip:

    pip install --upgrade acd-sdk

Migrating from version 1.x.x

The release of version 2 of the Annotator for Clinical Data SDK introduces a Python package name change from ibm_whcs_sdk to acd_sdk. For the migration from 1.x.x:

- Run `pip uninstall ibm-whcs-sdk`
- Run `pip install acd-sdk`, and confirm that you have version 2 installed by running `pip show acd-sdk`.
- In your application files that have a dependency on acd-sdk, update any import declarations from ibm_whcs_sdk to acd_sdk.

Using the SDK

For general SDK usage information, please see this link

Questions

For questions, refer to:

- Annotator for Clinical Data documentation
- Annotator for Clinical Data Support page

Issues

If you encounter an issue with the project, you are welcome to submit a bug report.

Contributing

See CONTRIBUTING.

License

The Annotator for Clinical Data Python SDK is released under the Apache 2.0 license. The license's full text can be found in LICENSE.
acd-tools
Rockwell ACD Project File ToolsThe Rockwell ACD file is an archive file that contains all the files that are used by RSLogix/Studio 5000.It consists of a number of text files containing version information, compressed XML files containing project and tag information as well as a number of database files.Parsing the ACD fileThe exporting of the L5X file isn't complete, we are able to parse the data types, tags and programs into a Controller python object though.To get the Controller object and get the program/routines/rungs/tags/datatypes, use something like thisfromacd.export_l5ximportExportL5xcontroller=ExportL5x("../resources/CuteLogix.ACD","build/output.l5x").controllerrung=controller.programs[0].routines[0].rungs[0]data_type=controller.data_types[-1]tag_name=controller.tags[75].texttag_data_type=controller.tags[75].data_typeUnzipTo extract the file use the acd.unzip.Unzip class. This extracts the database files to a directory.fromacd.unzipimportUnzipunzip=Unzip('CuteLogix.ACD')unzip.write_files('output_directory')
acdumo
acdumo

This app is a simple implementation of the Accelerated Dual Momentum investment strategy. It queries a Yahoo Finance API for historical ticker price data, calculates ADM statistics, and suggests a strategy.

Installation (command line)

For simple command line use of this app, you can install with pip:

    pip3 install acdumo

or

    pip3 install --user acdumo

Installation will require an extra step on macOS systems. Run the included acdumo-install-certifi command:

    acdumo-install-certifi

Usage (command line)

To generate a report for the current date, simply run:

    acdumo

Optional arguments can be used to generate reports for different dates or tickers:

    acdumo --help
    usage: acdumo [-h] [--date <yyyy-mm-dd>] [--tickers <TIC> [<TIC> ...]] [--bonds <TIC>]
                  [--frequency {monthly,weekly}]
                  [<path/to/report/dir/>]

    Accelerated dual momentum

    positional arguments:
      <path/to/report/dir/>  write a HTML report

    optional arguments:
      -h, --help            show this help message and exit
      --date <yyyy-mm-dd>   date of interest (default: today)
      --tickers <TIC> [<TIC> ...]
                            tickers to use (default: SPY TLT VSS SCZ)
      --bonds <TIC>         ticker representing bonds (default: TLT)
      --frequency {monthly,weekly}
                            frequency of data to fetch (default: monthly)

Installation (full app)

To run the app locally, use the following procedure. By default it is configured to use a gmail account of your choice for account confirmation emails. The gmail account must be configured to allow less secure apps.

    git clone https://github.com/anthony-aylward/acdumo.git
    cd acdumo
    python3 -m venv venv
    source venv/bin/activate
    pip3 install -e .
    # if on macOS, also run: python3 acdumo/install_certifi.py
    export FLASK_APP=acdumo
    export FLASK_ENV=development
    mkdir -p instance/protected
    python3 config/__init__.py --email <gmail address> instance/
    flask db upgrade
    flask run

You can then use a web browser to navigate to the app (by default at localhost:5000)
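The dual-momentum decision the app automates can be sketched without the Yahoo Finance dependency. This is not acdumo's actual code — the lookback windows, the averaging, and the fallback rule here are assumptions for illustration — but it shows the shape of the strategy: rank the equity tickers by trailing return and fall back to the bond ticker when momentum is negative:

```python
def trailing_return(prices, months):
    # fractional return over the last `months` entries of a monthly price series
    return prices[-1] / prices[-1 - months] - 1.0

def suggest_position(price_history, bonds="TLT", lookbacks=(1, 3, 6)):
    # Score each non-bond ticker by its average trailing return over the
    # lookback windows; if the best score is non-positive, hold bonds instead.
    scores = {
        tic: sum(trailing_return(p, m) for m in lookbacks) / len(lookbacks)
        for tic, p in price_history.items()
        if tic != bonds
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else bonds

history = {
    "SPY": [100, 102, 104, 107, 110, 112, 115],  # steadily rising
    "SCZ": [100, 101, 100, 99, 98, 97, 96],      # drifting down
    "TLT": [100, 100, 100, 100, 100, 100, 100],
}
print(suggest_position(history))  # -> SPY
```

With all equity tickers in a drawdown the same function returns the bond ticker, which is the "dual" part of the strategy.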
ace
ace is an implementation of the Alternating Conditional Expectation (ACE) algorithm [Breiman85], which can be used to find otherwise difficult-to-find relationships between predictors and responses and as a multivariate regression tool.

The code for this project, as well as the issue tracker, etc. is hosted on GitHub. The documentation is hosted at http://partofthething.com/ace.

What is it?

ACE can be used for a variety of purposes. With it, you can:

- build easy-to-evaluate surrogate models of data. For example, if you are optimizing input parameters to a complex and long-running simulation, you can feed the results of a parameter sweep into ACE to get a model that will instantly give you predictions of results of any combination of input within the parameter range.
- expose interesting and meaningful relations between predictors and responses from complicated data sets. For instance, if you have survey results from 1000 people and you want to see how one answer is related to a bunch of others, ACE will help you.

The fascinating thing about ACE is that it is a non-parametric multivariate regression tool. This means that it doesn't make any assumptions about the functional form of the data. You may be used to fitting polynomials or lines to data. Well, ACE doesn't do that. It uses an iteration with a variable-span scatterplot smoother (implementing local least squares estimates) to figure out the structure of your data.
As you'll see, that turns out to be a powerful difference.

Installing it

ace is available in the Python Package Index, and can be installed simply with the following.

On Linux:

    sudo pip install ace

On Windows, use:

    pip install ace

Directly from source:

    git clone [email protected]:partofthething/ace.git
    cd ace
    python setup.py install

Note: If you don't have git, you can just download the source directly from here.

You can verify that the installation completed successfully by running the automated test suite in the install directory:

    python -m unittest discover -bv

Using it

To use, get some sample data:

    from ace.samples import wang04
    x, y = wang04.build_sample_ace_problem_wang04(N=200)

and run:

    from ace import model
    myace = model.Model()
    myace.build_model_from_xy(x, y)
    myace.eval([0.1, 0.2, 0.5, 0.3, 0.5])

For some plotting (matplotlib required), try:

    from ace import ace
    ace.plot_transforms(myace.ace, fname='mytransforms.pdf')
    myace.ace.write_transforms_to_file(fname='mytransforms.txt')

Note that you could alternatively have loaded your data from a whitespace delimited text file:

    myace.build_model_from_txt(fname='myinput.txt')

Warning: The more data points ACE is given as input, the better the results will be. Be careful with less than 50 data points or so.

Demo

A combination of various functions with noise is shown in the demo figure. Given just those points and zero knowledge of the underlying functions, ACE comes back with the recovered transforms. A longer version of this demo is available in the Sample ACE Problems section.

Other details

This implementation of ACE isn't as fast as the original FORTRAN version, but it can still crunch through a problem with 5 independent variables having 1000 observations each in on the order of 15 seconds. Not bad.

ace also contains a pure-Python implementation of Friedman's SuperSmoother [Friedman82], the variable-span smoother mentioned above.
This can be useful on its own for smoothing scatterplot data.

History

The ACE algorithm was published in 1985 by Breiman and Friedman [Breiman85], and the original FORTRAN source code is available from Friedman's webpage.

Motivation

Before this package, the ACE algorithm has only been available in Python by using the rpy2 module to load in the acepack package of the R statistical language. This package is a pure-Python re-write of the ACE algorithm based on the original publication, using modern software practices. This package is slower than the original FORTRAN code, but it is easier to understand. This package should be suitable for medium-weight data and as a learning tool.

For the record, it is also quite easy to run the original FORTRAN code in Python using f2py.

About the Author

This package was originated by Nick Touran, a nuclear engineer specializing in reactor physics. He was exposed to ACE by his thesis advisor, Professor John Lee, and used it in his Ph.D. dissertation to evaluate objective functions in a multidisciplinary design optimization study of nuclear reactor cores [Touran12].

License

This package is released under the MIT License, reproduced here.

References

[Breiman85] L. BREIMAN and J. H. FRIEDMAN, "Estimating optimal transformations for multiple regression and correlation," Journal of the American Statistical Association, 80, 580 (1985).

[Friedman82] J. H. FRIEDMAN and W. STUETZLE, "Smoothing of scatterplots," ORION-003, Stanford University, (1982).

[Wang04] D. WANG and M. MURPHY, "Estimating optimal transformations for multiple regression using the ACE algorithm," Journal of Data Science, 2, 329 (2004).

[Touran12] N. TOURAN, "A Modal Expansion Equilibrium Cycle Perturbation Method for Optimizing High Burnup Fast Reactors," Ph.D. dissertation, Univ. of Michigan, (2012).
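The scatterplot-smoothing idea behind the package is easy to see in miniature. The toy below is not the package's SuperSmoother — that one picks variable spans and fits local least-squares lines — but a fixed-span running mean shows what "smoothing scatterplot data" means in the simplest case:

```python
def smooth(y, span=3):
    # fixed-span running mean; at the edges the window is simply truncated
    half = span // 2
    out = []
    for i in range(len(y)):
        window = y[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

# a single spike gets spread over its neighbours
print(smooth([0, 0, 9, 0, 0]))  # -> [0.0, 3.0, 3.0, 3.0, 0.0]
```

SuperSmoother improves on this by choosing the span per point (small spans where the data bends sharply, large spans where it is flat), which is why ACE can recover curved transforms without a parametric model.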
aceagentaws
### Description

ACE Agent helper resources.
aceagentlogger
### Description

Splunk related helper resources.
ace-api
Python Bindings for ACE REST APIExamplesConnect to a ServerSetting the default remote host:>>> import ace_api >>> server = 'ace.integraldefense.com' >>> ace_api.set_default_remote_host(server) >>> ace_api.ping() {'result': 'pong'}Setting the remote host for an Analysis class:>>> analysis = ace_api.Analysis('this is the analysis description') >>> analysis.set_remote_host('something.else.com').remote_host 'something.else.com'If your ACE instance is listening on a port other than 443, specify it like so::>>> ace_api.set_default_remote_host('ace.integraldefense.com:24443') >>> ace_api.default_remote_host 'ace.integraldefense.com:24443'Submit a File to ACE>>> path_to_file = 'Business.doc' >>> analysis.add_file(path_to_file) <ace_api.Analysis object at 0x7f23d57e74e0> >>> analysis.add_tag('Business.doc').add_tag('suspicious doc') <ace_api.Analysis object at 0x7f23d57e74e0> >>> analysis.submit() <ace_api.Analysis object at 0x7f23d57e74e0> >>> analysis.status 'NEW' >>> analysis.status 'ANALYZING' >>> analysis.status 'COMPLETE (Alerted with 8 detections)' >>> result_url = 'https://{}/ace/analysis?direct={}'.format(analysis.remote_host, analysis.uuid) >>> print("\nThe results of this submission can be viewed here: {}".format(result_url))The results of this submission can be viewed here:https://ace.integraldefense.com/ace/analysis?direct=137842ac-9d53-4a25-8066-ad2a1f6cfa17Submit a URL to Cloudphish>>> another_url = 'http://medicci.ru/myATT/tu8794_QcbkoEsv_Xw20pYh7ij' >>> cp_result = ace_api.cloudphish_submit(another_url) >>> cp_result['status'] 'NEW' >>> # Query again, a moment later: ... 
>>> cp_result = ace_api.cloudphish_submit(another_url) >>> cp_result['status'] 'ANALYZED' >>> cp_result['analysis_result'] 'ALERT' >>> result_url = 'https://{}/ace/analysis?direct={}'.format(ace_api.default_remote_host, cp_result['uuid']) >>> print("\nThe results of this submission can be viewed here: {}".format(result_url))The results of this submission can be viewed here:https://ace.integraldefense.com/ace/analysis?direct=732ec396-ce20-463f-82b0-6b043b07f941DocumentationACE's API documentation:View ACE's full documentation here:https://ace-analysis.readthedocs.io/en/latest/
ace-authorize
No description available on PyPI.
aceawslogger
No description available on PyPI.
ace-backend
No description available on PyPI.
acechmscoring
acecm
ace-cookie-manager

streamlit-custom-component: a Streamlit component that allows you to work with cookies.

Installation instructions

    pip install acecm

Usage instructions

    import streamlit as st
    from acecm import sc, gc

    # `value` is undefined in the upstream snippet; presumably it is a cookie
    # value read back via gc before being displayed
    st.write(value)
ace_components
No description available on PyPI.
ace-cream
This project provides a Python wrapper for a Fortran implementation of the ACE algorithm.

Install Binary Distribution

Currently, only a 64-bit binary distribution is provided. Run `pip install ace_cream` to install the binary distribution.

| Platform | py3.6 | py3.7 | py2.7 |
| --- | --- | --- | --- |
| Windows | T | T | T |
| MacOS | T | T | |
| Linux | T | T | T |

How to build

You need numpy and a Fortran compiler to build from source.

Windows

- Install the Visual C++ toolchain.
- Download MinGW-w64 from sourceforge, which provides the necessary Fortran compiler.
- Install MinGW-w64 and add the {install_dir}\mingw64\bin path to the environment variable (make gfortran accessible from the command line).
- (for conda environment) Add {install_dir}\Anaconda3\Scripts to the environment variable (make f2py accessible from the command line).

Mac

You can use a package manager to install gfortran (included within the GNU compiler collection). For example, with Homebrew you can use:

    brew install gcc

Ubuntu

To install gfortran, use the default package manager:

    sudo apt-get install gfortran

Then run `python setup.py install` from the command line at the project root directory.

How to use

    import numpy as np
    from ace_cream import ace_cream

    N_SIZE = 1000  # sample size (not defined in the original snippet)

    # discrete case, binary symmetric channel with crossover probability 0.1
    x = np.random.choice([0, 1], size=N_SIZE)
    n = np.random.choice([0, 1], size=N_SIZE, p=[0.9, 0.1])
    y = np.mod(x + n, 2)
    # set both x (cat=0) and y (cat=-1) as categorical type
    tx, ty = ace_cream(x, y, cat=[-1, 0])

    # continuous case
    x = np.random.uniform(0, np.pi, 200)
    y = np.exp(np.sin(x) + np.random.normal(size=200) / 2)
    tx, ty = ace_cream(x, y)

change log

- v0.1 initial commit
- v0.2 modify to relative import in __init__.py
- v0.3 add support for multiple columns of x and other directions of transformation
- v0.4 add f_mapping function and unittests for this function

License

Apache License Version 2.0

Reference

https://en.wikipedia.org/wiki/Alternating_conditional_expectations
ace-database
acedatacloud-scaffold
Ace Data Cloud ScaffoldInstall:pip install acedatacloud-scaffoldSample:fromacedatacloud_scaffoldimportBaseControllerasControllerfromacedatacloud_scaffoldimportBaseHandlerimportjsonclassHandler(BaseHandler):asyncdefget(self,id=None):result={'value':id}self.write(json.dumps(result))controller=Controller()controller.add_handler(r'/test/(.*)',Handler)controller.start()
ace-distributions
No description available on PyPI.
aceditor
Ace Editor

ACE full stack editor with JavaScript and Python CGI.

Features

- Common Keyboard Shortcuts
- Quick Access Toolbar
- Event Status Bar
- File Bookmarks with Import/Export
- File History for Backup
- Line Separator Changer
- Print Function, Log File

Preview

Demo: https://foxe6.github.io/aceditor/

Example

See /aceditor/aceditor/example/.
aced-submission
submissionUtilities to upload metadata and files to ACED's Gen3 instanceSetuppython3 -m venv venv source venv/bin/activate pip install -r requirements.txt pip install -e .Usesee etl podDistributionPyPi# update pypi # pypi credentials - see https://twine.readthedocs.io/en/stable/#environment-variables export TWINE_USERNAME= # the username to use for authentication to the repository. export TWINE_PASSWORD= # the password to use for authentication to the repository. # this could be maintained as so: export $(cat .env | xargs) rm -r dist/ python3 setup.py sdist bdist_wheel twine upload dist/*
ace-elispot
ACE Configurator for ELISpotACE facilitates (1) generation of ELISpot configurations (peptide-pool assignments) using a deep learning approach to cluster similar peptides and (2) deconvolution of pool spot counts for identification of immunogenic peptides.01. Installation01-1. Standalone Graphical User Interface (Recommended)Please note that the ACE GUI software will take a long time to load (~30 seconds :coffee:). We also recommend that you haveGoogle Chromeinstalled on your machine.Operating SystemLinkVersionMacDownloadv0.1.1.0 (latest)Windows 10Downloadv0.1.1.0 (latest)For Windows versions, first unzip the file and look for an application file calledACEinside the unzipped foler. Previous versions of ACE are availablehere.01-2. Python PackageACE is available onPyPIpip install ace-elispotYou can also download a specific version of ACE fromhere.Subsequently install the ACE package using pip:pip install ace-elispot-<version>.tar.gzDependenciespython3 (>= 3.10)pandas (>=1.5.2)numpy (>=1.23.1)ortools (9.3.10497)torchtransformers (==4.30.2)scikit-learnopenpyxlgolfy (>=2.5.0)levenshtein02. UsageACE is available as a command-line interface after you install the python package:usage: ace [-h] [--version] {generate,deconvolve,verify} ... ACE Configurator for ELISpot. positional arguments: {generate,deconvolve,verify} ACE sub-commands. generate Generates an ELISpot experiment configuration. deconvolve Deconvolve hit peptide IDs given read-outs from an ELISpot experiment. verify Verifies whether an ELISpot assignment satisfies all ACE constraints. options: -h, --help show this help message and exit -v, --version show program version number and exitRead the full documentation on the python package athttps://pirl-unc.github.io/ace/03. CitationIf you use ACE in a publication, please cite ourpreprintdescribing ACE.
acefile
This single-file, pure python 3, no-dependencies implementation is intended to be used as a library, but also provides a stand-alone unace utility. As mostly pure-python implementation, it is significantly slower than native implementations, but more robust against vulnerabilities.This implementation supports up to version 2.0 of the ACE archive format, including the EXE, DELTA, PIC and SOUND modes of ACE 2.0, password protected archives and multi-volume archives. It does not support writing to archives. It is an implementation from scratch, based on the 1998 document titled “Technical information of the archiver ACE v1.2” by Marcel Lemke, using unace 2.5 and WinAce 2.69 by Marcel Lemke as reference implementations.For more information, API documentation, source code, packages and release notifications, refer to:https://www.roe.ch/acefilehttps://apidoc.roe.ch/acefilehttps://github.com/droe/acefilehttps://pypi.python.org/pypi/acefilehttps://twitter.com/droethlisberger
ace-hunter
ace-hunterace-hunteris primarily a command line tool for performing hunt validation in ACE environments. It's derived directly from the ACE Hunting System and can serve has a drop in replacement with some small changes to the ACE Hunting System.Splunk hunts are the only hunts currently supported.Installpip install ace_hunterYou could also git clone this repo andpython3 setup.py installinside whatever python environment you wish.NOTE: I've only tested this in python3.9 but it should work for python>=3.7.CLI ToolA tool calledhuntis made available on the command line after install. For legacy reasons the tool can also be found underace-hunt.$hunt-husage: hunt [-h] [-d] {list-types,lt,list,l,verify,v,execute,e,config-query,cq,configure,c} ...A hunting tool for ACE ecosystems.positional arguments:{list-types,lt,list,l,verify,v,execute,e,config-query,cq,configure,c}list-types (lt) List the types of Hunts configured.list (l) List the available hunts. The format of the output is E|D type:name - description E: enabled D: disabledverify (v) Verifies that all configured hunts are able to load.execute (e) Execute a hunt with the given parameters.config-query (cq) Query the Hunter configuration.configure (c) Configure Hunter requirements.optional arguments:-h, --help show this help message and exit-d, --debug Turn on debug logging.ConfigureYou will need to configure ace-hunter to work with your Splunk environment, your splunk hunt rules, and optionally your ACE environment.Configuration items can be overridden on a system and user level. 
Config items take the following precedence, where items found later override earlier ones:Built in defaults.ACE settings at/opt/ace/etc/saq.hunting.ini.System level settings at/etc/ace/hunting.ini.User level settings at~/.config/ace/hunting.ini.Special Environment VariablesMost of theace-hunterconfiguration flexibility is so it may be dropped directly into ACE or for later convenience as much lighter ace-hunting docker container.Basic CLI Hunting ConfigurationBelow is an example of the minimum requirements for Splunk hunting withace-hunter.[splunk] ; ex. uri = https://your.splunk.address uri = ; timezone of your splunk server. ex: US/Eastern timezone = username = password = ; Can supply path to CA cert, yes for using system certs, no to turn off. ssl_verification = [SSL] ; SSL section is for submitting results to ACE. ; The ca_chain_path will be attempted if supplied. ; Next, systems certs used unless verify_ssl set to False. verify_ssl = ca_chain_path = [hunt_type_splunk] ; Optionally specify the base location all rule directories ; will be relative to. ; Example showing that current user references will be expanded: ;detection_dir = ~/detections ; This is for convenience. SAQ_HOME or other settings can also be used. detection_dir = ; Comma sep list pointing to your different splunk rule dirs. 
rule_dirs = hunts/splunk/hippo,hunts/splunk/catEasy User Level ConfigurationYou can easily override whatever config settings you need with thehunt configureAPI.Ex: save your rules directories:➜ hunt configure hunt_type_splunk.rule_dirs -v 'hunts/splunk/hippo,hunts/splunk/cat'2022-02-04 14:49:23 MacBook-Pro ace_hunter.config[1141] INFO saving passed value to hunt_type_splunk.rule_dirs to /Users/sean/.config/ace/hunting.ini2022-02-04 14:49:23 MacBook-Pro ace_hunter.config[1141] INFO saved configuration to: /Users/sean/.config/ace/hunting.iniEx: save your password:➜ hunt configure splunk.passwordEnter value for splunk.password:2022-02-04 14:50:56 MacBook-Pro ace_hunter.config[1565] INFO saving passed value to splunk.password to /Users/sean/.config/ace/hunting.ini2022-02-04 14:50:56 MacBook-Pro ace_hunter.config[1565] INFO saved configuration to: /Users/sean/.config/ace/hunting.iniIf thehunttool creates or edits the user level config at~/.config/ace/hunting.inithe file will be made RW for the current user only.TODOAllow proxy settings to be configurable for flexibility. Use use environment variables as needed for now.
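The layered-override behaviour described above ("items found later override earlier ones") is ordinary INI merging, and can be sketched with the standard library's configparser. The file names and values here are made up; only the precedence mechanics are the point:

```python
import configparser

# lower-precedence settings (think /etc/ace/hunting.ini)
system_ini = """
[splunk]
uri = https://splunk.example
timezone = US/Eastern
"""

# higher-precedence settings (think ~/.config/ace/hunting.ini)
user_ini = """
[splunk]
timezone = US/Central
"""

cfg = configparser.ConfigParser()
cfg.read_string(system_ini)  # read earlier, lower-precedence sources first...
cfg.read_string(user_ini)    # ...so later sources override on conflicting keys

print(cfg["splunk"]["timezone"])  # -> US/Central  (user file wins)
print(cfg["splunk"]["uri"])       # -> https://splunk.example  (inherited)
```

Reading the files in precedence order and letting each parse overwrite the previous values is all that is needed: keys absent from the later file fall through to the earlier one.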
ac-electricity
acelectricity module

The aim of this project is to understand the rudiments of linear AC electrical circuits and their functioning.

Prerequisites

- Python : version 3
- All operating systems
- Libraries : numpy, matplotlib.pyplot, matplotlib.widgets

Installing acelectricity python module

From Pypi repository : https://pypi.org/project/ac-electricity

```
pip install ac-electricity
```

Basic electrical quantities

- Voltage (volt)
- Current (ampere)
- Impedance Z (ohm)
- Admittance Y (siemens)
- Active power P (watt)
- Reactive power Q (var)
- Apparent power S (VA)

Electrical laws

- Kirchhoff’s current law
- Kirchhoff’s voltage law
- Ohm's law
- Impedances, admittances in series and parallel
- Voltage divider and current divider
- Millman's theorem
- Joule's first law (S=I²Z)
- Power law (S=V.I*)

Other features

- Bode plot (with cursors and sliders)
- User-defined transfer function H(jω)
- Digital filter frequency response

AC Circuit diagram

Circuit should only contain :

- independent sine voltage source
- independent sine current source
- resistors, inductors, capacitors

Example circuit

```
              VL
    <---------------------
    +------+      L1
--->--| R1 |---/ \/ \/ \----+-------+
 IL  +------+               |       |
  |                    IR v   v IC
  ^                       +---+   -----
  |                       |   |   ----- C1
 Vin                      |R2 |     |      Vout
  |                       |   |     |
  |                       +---+     |
  |                         |       |
  --------------------------+-------+
```

Datas :

- Vin = 5 Vrms
- Inductor : L1 = 100 mH ; R1 = 180 Ω
- R2 = 2.2 kΩ ; C1 = 330 nF

What are the IL, IR, IC currents ?
What are the VL, Vout voltages ?
What is the frequency response Vout/Vin ?

```python
>>> from acelectricity import *
>>> Vin = Voltage(5)  # Vrms
>>> Zr1 = Impedance(r=180)
>>> Zl1 = Impedance(l=0.1)
>>> Zr2 = Impedance(r=2200)
>>> Zc1 = Impedance(c=330e-9)
>>> Zeq1 = 1/(1/Zr2 + 1/Zc1)  # impedances in parallel
>>> # or Zeq1 = Zr2//Zc1
>>> Zeq1
Complex impedance (Ω) : [email protected]
>>> Zeq = Zr1 + Zl1 + Zeq1  # impedances in series
>>> Zeq(2000)  # @ 2000 Hz
(206.11818300356995+1018.3560443060462j)
>>> IL = Vin/Zeq  # Ohm's law
>>> IL.properties(2000)
Frequency (Hz) : 2000
Angular frequency (rad/s) : 12566.4
Complex current : 0.000954663-0.00471665j
Amplitude (Arms) : 0.00481229
Amplitude (A) : 0.00680561
Amplitude (dBA ref 1 Arms) : -46.3529621108357
Phase (degrees) : -78.5577
Phase (radians) : -1.37109
i(t) = 0.00680561×sin(12566.4×t-1.371091)
>>> Vout = IL*Zeq1  # Ohm's law
>>> VL = Vin-Vout  # Kirchhoff’s voltage law
>>> IC = Vout/Zc1  # Ohm's law
>>> IR = IL-IC  # Kirchhoff’s current law
>>> H = Vout/Vin  # transfer function
>>> # draw Bode plot and save datas
>>> H.bode(title='Vout/Vin transfer function', filename='h.csv')
>>> IL.bode(magnitude_unit='default', yscale='linear', title='IL current')
>>> Zeq.bode(yscale='log', title='Zeq frequency response')
>>> Zeq.bode_imag(title='Zeq reactance frequency response')
>>> show()
```

Zoom and data cursors :

Impedances and admittances

```python
>>> Yr2 = 1/Zr2
>>> Yc1 = 1/Zc1
>>> Yc1
Complex admittance (S) : [email protected]
>>> 1/Yc1
Complex impedance (Ω) : [email protected]
>>> 1/(Yr2+Yc1)
Complex impedance (Ω) : [email protected]
>>> Zr2*(Zc1/(Zr2+Zc1))
Complex impedance (Ω) : [email protected]
```

Law class

```python
>>> law = Law()
>>> # voltage divider
>>> Vout = law.VoltageDivider(vtotal=Vin, z=Zr2//Zc1, z2=Zr1+Zl1)
>>> Vout
Complex voltage (Vrms) : [email protected]
>>> Vin*(Zeq1/Zeq)
Complex voltage (Vrms) : [email protected]
>>> # Millman's theorem
>>> gnd = Voltage(0)
>>> Vout = law.Millman(v_z=[(Vin, Zr1+Zl1), (gnd, Zr2), (gnd, Zc1)])
>>> Vout
Complex voltage (Vrms) : [email protected]
>>> (Vin/(Zr1+Zl1))/(1/(Zr1+Zl1)+1/Zr2+1/Zc1)
Complex voltage (Vrms) : [email protected]
>>> Vout/Vin
Ratio : [email protected]
>>> # current divider
>>> IC = law.CurrentDivider(itotal=IL, z=Zc1, z2=Zr2)
>>> IC
Complex current (Arms) : [email protected]
>>> IL*Zr2/(Zr2+Zc1)
Complex current (Arms) : [email protected]
```

Complex power, Joule's first law

```python
>>> S = IL*Vin  # input source complex power
>>> S.properties(1000)
Frequency (Hz) : 1000
Angular frequency (rad/s) : 6283.19
Complex power : 0.0655242+0.0392254j
Active power P (W) : +0.0655242
Reactive power Q (var) : +0.0392254
Apparent power S (VA) : 0.0763678
Phase (degrees) : +30.9064
Phase (radians) : +0.539419
Power factor PF : 0.8580073402000852
Active power (dBW ref 1 W) : -11.835985322667693
>>> Sr1 = law.Joule(z=Zr1, i=IL)
>>> Sr1
Complex power (W) : [email protected]
>>> Zr1*IL*IL
Complex power (W) : [email protected]
>>> Sr2 = law.Joule(z=Zr2, v=Vout)
>>> Sr2
Complex power (W) : [email protected]
>>> Sl1 = Zl1*IL*IL
>>> Sl1
Complex power (W) : [email protected]
>>> Sc1 = Zc1*IC*IC
>>> Sc1
Complex power (W) : [email protected]
>>> Sr1+Sr2+Sc1+Sl1
Complex power (W) : [email protected]
```

Parametric analysis

```python
>>> Vin.RMS = 10
>>> Vin.phase = 45
>>> Zr1.r = 100
>>> Zl1.l = 0.22
>>> Zr2.r = 1000
>>> Zc1.c = 100e-9
>>> H.bode(title='Vout/Vin transfer function', filename='h2.csv')
>>> IL.bode(magnitude_unit='default', yscale='linear', title='IL current')
>>> show()
>>> for Zr2.r in [10, 100, 1e3, 1e4]:
        H.bode(title="Vout/Vin with R2={}Ω".format(Zr2.r))
>>> show()
```

Add matplotlib widgets : slider and button

```python
>>> fig, ax1, ax2, mag, ph = H.bode(title='Vout/Vin transfer function')
>>> fig.subplots_adjust(left=0.15, bottom=0.35, right=0.85, top=0.9)
>>> ax_r1 = plt.axes([0.15, 0.20, 0.4, 0.03])
>>> ax_l1 = plt.axes([0.15, 0.15, 0.4, 0.03])
>>> ax_r2 = plt.axes([0.15, 0.10, 0.4, 0.03])
>>> ax_c1 = plt.axes([0.15, 0.05, 0.4, 0.03])
>>> slider_r1 = Slider(ax_r1, 'r1 (Ω)', 10, 300, valinit=Zr1.r, valstep=1)
>>> slider_l1 = Slider(ax_l1, 'l1 (H)', 0.01, 1, valinit=Zl1.l, valstep=0.01)
>>> slider_r2 = Slider(ax_r2, 'r2 (Ω)', 1e3, 1e4, valinit=Zr2.r, valstep=10)
>>> slider_c1 = Slider(ax_c1, 'c1 (nF)', 100, 1000, valinit=Zc1.c*1e9, valstep=1)
>>> def update(val):
        Zr1.r = slider_r1.val
        Zl1.l = slider_l1.val
        Zr2.r = slider_r2.val
        Zc1.c = slider_c1.val*1e-9
        xvalues = mag.get_xdata()
        # H.db() method, according to H.bode() parameters :
        # magnitude_unit='db' (default)
        # help(H) for more information
        magnitudes = [H.db(x) for x in xvalues]
        # H.phase_deg() method, according to H.bode() parameters :
        # phase_unit='degrees' (default)
        phases = [H.phase_deg(x) for x in xvalues]
        mag.set_ydata(magnitudes)
        ph.set_ydata(phases)
>>> slider_r1.on_changed(update)
>>> slider_l1.on_changed(update)
>>> slider_r2.on_changed(update)
>>> slider_c1.on_changed(update)
>>> ax_button = plt.axes([0.7, 0.10, 0.15, 0.06])
>>> button = Button(ax_button, 'Autoscale')
>>> def autoscale(event):
        ax1.relim()
        ax1.autoscale_view()
        ax2.relim()
        ax2.autoscale_view()
>>> button.on_clicked(autoscale)
>>> show()
```

User-defined transfer function

Example 1 : second order band-pass filter

```python
>>> from acelectricity import *
>>> # static gain, damping value, normal angular frequency
>>> a, z, wn = 10, 0.1, 1000*2*math.pi
>>> H = Ratio(fw=lambda w: a*(2*z*1j*w/wn)/(1+2*z*1j*w/wn-(w/wn)**2))
>>> H.bode(filename='H.csv')
>>> a, z, wn = 100, 0.5, 10000
>>> H.bode(filename='H2.csv')
>>> show()
```

or :

```python
>>> a, z, wn = 10, 0.1, 1000*2*math.pi
>>> H = Ratio.transfer_function(numerator=[0, a*(2*z*1j/wn)],
                                denominator=[1, 2*z*1j/wn, -1/wn**2])
>>> H.bode(filename='H3.csv')
>>> a, z, wn = 100, 0.5, 10000
>>> # new instance
>>> H = Ratio.transfer_function(numerator=[0, a*(2*z*1j/wn)],
                                denominator=[1, 2*z*1j/wn, -1/wn**2])
>>> H.bode(filename='H4.csv')
>>> show()
```

Example 2 : cascaded series, parallel filters

```python
>>> from acelectricity import *
>>> # first order low-pass filter
>>> wn = 10000
>>> Hlp = Ratio(fw=lambda w: 1/(1+1j*w/wn))
>>> # first order high-pass filter
>>> Hhp = Ratio(fw=lambda w: 1/(1+1j*wn/w))
>>> Hs = Hlp*Hhp  # cascaded series filters
>>> Hs.bode()
>>> Hp = Hlp+Hhp  # parallel filters
>>> Hp.bode()
>>> show()
```

Example 3 : linear control system

```
       +   +------+
 -->---(X)---| G(w) |----+--->--
       - |   +------+    |
         |               |
         |   +------+    |
         +----| H(w) |-<--+
             +------+
```

```python
>>> from acelectricity import *
>>> # feedforward transfer function
>>> # first order low-pass filter
>>> wn = 10000
>>> G = Ratio.transfer_function([1], [1, 1j/wn])
>>> # feedback transfer function
>>> H = Ratio.transfer_function([10])  # constant
>>> # open-loop transfer function
>>> Hopenloop = G*H
>>> Hopenloop.bode()
>>> # closed-loop transfer function
>>> Hcloseloop = G/(1+Hopenloop)
>>> Hcloseloop.bode()
>>> show()
```

Digital filter frequency response

y(n) = 0.1x(n) + 1.6y(n-1) - 0.7y(n-2)

```python
>>> from acelectricity import *
>>> fs = 100000  # sampling rate (Hz)
>>> H = Ratio.digital_filter(fs=fs, b=[0.1], a=[1, -1.6, 0.7])
>>> H.bode(xmin=0, xmax=fs/2, xscale='linear', title='IIR digital filter')
>>> show()
```

y(n) = (x(n)+x(n-1)+...+x(n-7))/8

```python
>>> from acelectricity import *
>>> fs = 10000  # sampling rate (Hz)
>>> H = Ratio.digital_filter(fs=fs, b=[1/8]*8)
>>> H.bode(xmin=0, xmax=fs/2, xscale='linear', magnitude_unit='default', title='FIR digital filter')
>>> show()
```

Custom default frequency

```python
>>> from acelectricity import *
>>> Yc = Admittance(c=220e-6)
>>> Yc
Complex admittance (S) : [email protected]
>>> ElectricalQuantity.DEFAULT_FREQUENCY = 50
>>> Yc
Complex admittance (S) : 0+0.069115j @ 50 Hz
>>> 1/Yc
Complex impedance (Ω) : 0-14.4686j @ 50 Hz
```

Goodies

Ideal filters

```python
>>> from acelectricity import *
>>> def lp(w):
        wn = 10000
        return 1 if w < wn else 0.001
>>> def hp(w):
        wn = 1000
        return 10 if w > wn else 0.001
>>> def bp(w):
        wn1, wn2 = 1000, 10000
        return 1 if wn2 > w > wn1 else 0.001
>>> def notch(w):
        wn1, wn2 = 4000, 5000
        return 0.001 if wn2 > w > wn1 else 1
>>> def allpass(w):
        wn = 10000
        return 1j if w > wn else -1j
>>> Hlp = Ratio(fw=lp)
>>> Hhp = Ratio(fw=hp)
>>> Hbp = Ratio(fw=bp)
>>> Hnotch = Ratio(fw=notch)
>>> Hallpass = Ratio(fw=allpass)
>>> Hallpass.bode(title='Ideal all-pass filter')
>>> (Hhp+Hlp).bode(title='Ideal filter Bode plot')
>>> show()
```

Note : remember that ideal filters are not realizable (not causal).

Asymptotic Bode diagram

```python
>>> def lp1(w):
        # first order low-pass filter 1/(1+1j*w/wn)
        # asymptotic approximation
        wn = 1000
        return 1 if w < wn else 1/(1j*w/wn)
>>> def lp2(w):
        wn = 30000
        return 1 if w < wn else 1/(1j*w/wn)
>>> Hlp1 = Ratio(fw=lp1); Hlp2 = Ratio(fw=lp2)
>>> (Hlp1*Hlp2).bode(title='Asymptotic Bode plot')
>>> show()
```

Advantages and limitations

This module manages basic arithmetic operations `+ - * /` as well as `//`, which designates two impedances in parallel.

The dimensional homogeneity is checked :

```python
>>> V3 = V1-V2+I3  # V+A -> Error
TypeError: Voltage expected
>>> I1 = Current(2)
>>> I = I1+0.5  # A+number -> Error
TypeError: Current expected
>>> I2 = Current(0.5, phase=30)
>>> I = I1+I2
>>> I
Complex current (Arms) : [email protected]
>>> I = 5*I2-V1/Z1+I3
```

The result of any operation must give a quantity whose unit is one of : V, A, Ω, S, W (or ratio). Otherwise, you will get an error :

```python
>>> Z1/V1  # Ω/V -> 1/A -> Error
TypeError
>>> U2*(Z3/(Z2+Z3))  # V*(Ω/Ω) -> V*() -> V
>>> U2*Z3/(Z2+Z3)  # V*Ω -> Error
TypeError
>>> S = V1*(V1/Z1)  # V*(V/Ω) -> V*A -> W
>>> S = V1*V1/Z1  # V*V -> Error
TypeError
```

See also

https://pypi.org/project/dc-electricity
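The worked circuit above can be cross-checked with plain Python complex arithmetic, without acelectricity. This sketch recomputes Zeq and IL at 2000 Hz and the source complex power at 1000 Hz, reproducing the values printed by `Zeq(2000)`, `IL.properties(2000)` and `S.properties(1000)` (the helper function `zeq` is mine, not part of the library):

```python
import math

# Example circuit values: R1 = 180 Ω, L1 = 100 mH, R2 = 2.2 kΩ, C1 = 330 nF
def zeq(f):
    """Equivalent impedance seen by the 5 Vrms source at frequency f."""
    w = 2 * math.pi * f
    Zl1 = 1j * w * 0.1              # jωL
    Zc1 = 1 / (1j * w * 330e-9)     # 1/(jωC)
    return 180 + Zl1 + 2200 * Zc1 / (2200 + Zc1)   # R1 + L1 + (R2 // C1)

Zeq = zeq(2000)
IL = 5 / Zeq                         # Ohm's law
S = 5 * (5 / zeq(1000)).conjugate()  # power law S = V·I* at 1000 Hz

print(Zeq)  # ≈ (206.118+1018.356j), matching Zeq(2000) above
print(IL)   # ≈ (0.000954663-0.00471665j)
print(S)    # ≈ (0.0655242+0.0392254j)
```

The agreement with the library's printed figures is a quick sanity check that the series/parallel reductions and the S = V·I* convention behave as described.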
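The `Ratio.digital_filter` results can likewise be checked by evaluating the difference equation's transfer function H(z) = B(z)/A(z) at z = e^(j2πf/fs) directly. This standalone sketch (the helper name is mine, not an acelectricity API) confirms that both digital filters above have unit DC gain:

```python
import cmath
import math

def digital_freq_response(b, a, f, fs):
    """H(e^jw) = sum(b[k]·e^(-jwk)) / sum(a[k]·e^(-jwk)), with w = 2πf/fs."""
    z = cmath.exp(-2j * math.pi * f / fs)
    num = sum(bk * z**k for k, bk in enumerate(b))
    den = sum(ak * z**k for k, ak in enumerate(a))
    return num / den

# IIR filter from the text: y(n) = 0.1x(n) + 1.6y(n-1) - 0.7y(n-2)
h0 = digital_freq_response([0.1], [1, -1.6, 0.7], 0, 100000)
print(abs(h0))      # DC gain ≈ 1.0, since 0.1/(1-1.6+0.7) = 1

# FIR moving average from the text: y(n) = (x(n)+...+x(n-7))/8
h0_fir = digital_freq_response([1/8]*8, [1], 0, 10000)
print(abs(h0_fir))  # DC gain ≈ 1.0
```

The same function evaluated over a grid of frequencies up to fs/2 reproduces the curves that `H.bode(xmin=0, xmax=fs/2, xscale='linear')` draws.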
acellera-acedock
No description available on PyPI.
acellera-acegen
No description available on PyPI.