package (stringlengths: 1–122)
package-description (stringlengths: 0–1.3M)
acellera-acemodel
No description available on PyPI.
acellera-acemoietysearch
No description available on PyPI.
acellera-aceprep
No description available on PyPI.
acellera-aceprofiler
No description available on PyPI.
acellera-acerescore
No description available on PyPI.
acellera-adaptivesampling
No description available on PyPI.
acellera-bindscope
No description available on PyPI.
acellera-create-pm-repo
No description available on PyPI.
acellera-crypticscout
No description available on PyPI.
acellera-deepsite
No description available on PyPI.
acellera-deltadelta
No description available on PyPI.
acellera-dockingprotocols
No description available on PyPI.
acellera-duck
No description available on PyPI.
acellera-energyforcecalculators
No description available on PyPI.
acellera-envlicenses
# EnvLicenses

This package allows you to inspect a conda environment for the licenses of all installed packages.

## Installing

pip install acellera-envlicenses

## Running

envlicenses /home/user/miniconda3/envs/myenv/
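An illustrative sketch only, not the acellera-envlicenses implementation: the stdlib can approximate what such a tool does by walking the installed distributions of an environment and collecting their declared licenses (the function name `collect_licenses` is invented for this example).

```python
# Minimal stdlib approximation of an environment license inspector.
from importlib import metadata

def collect_licenses():
    """Map each installed distribution name to its declared License field."""
    licenses = {}
    for dist in metadata.distributions():
        name = dist.metadata.get("Name", "unknown")
        licenses[name] = dist.metadata.get("License") or "UNKNOWN"
    return licenses

if __name__ == "__main__":
    for pkg, lic in sorted(collect_licenses().items()):
        print(f"{pkg}: {lic}")
```

The real tool additionally resolves conda (not just pip) metadata, so treat this as the general idea rather than equivalent behavior.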
acellera-func2argparse
Failed to fetch description. HTTP Status Code: 404
acellera-generative
No description available on PyPI.
acellera-glimpse
No description available on PyPI.
acellera-htmd
[![Build Status](https://travis-ci.org/Acellera/htmd.svg?branch=master)](https://travis-ci.org/Acellera/htmd) [![Language Grade: Python](https://img.shields.io/lgtm/grade/python/g/Acellera/htmd.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/Acellera/htmd/context:python) [![Conda](https://anaconda.org/acellera/htmd/badges/version.svg)](https://anaconda.org/acellera/HTMD)

# HTMD: Programming Environment for Molecular Discovery

[HTMD](https://www.htmd.org) (an acronym for High-Throughput Molecular Dynamics) is a programmable, extensible platform written in Python. It provides a complete workspace for simulation-based discovery through molecular simulations, aiming to solve the data generation and analysis problem as well as to increase reproducibility.

## Licensing

HTMD Community Edition is free to use for non-profit work. Contact Acellera ([www.acellera.com/contact](https://www.acellera.com/contact/)) for information on the full version, HTMD Pro, or if you need a different license.

## Download HTMD

### Using released versions

HTMD is distributed through the conda package manager. The instructions for downloading HTMD can be found at [https://software.acellera.com/download.html](https://software.acellera.com/download.html).

### Using this repository

If you want to use this repository, we recommend still downloading a released version of HTMD to have all dependencies, and then setting PYTHONPATH to the git directory.

## HTMD Documentation and User Guide

For HTMD documentation, please visit: [https://software.acellera.com/docs/latest/htmd/api.html](https://software.acellera.com/docs/latest/htmd/api.html). For a user guide (easy-to-start examples), please visit: [https://software.acellera.com/docs/latest/htmd/tutorials.html](https://software.acellera.com/docs/latest/htmd/tutorials.html)

## Support and Development

Please report bugs via [GitHub Issues](https://github.com/acellera/htmd/issues). HTMD is open-source software and we welcome contributions from the community. For more information on how to contribute to HTMD, please visit: [https://software.acellera.com/docs/latest/htmd/developers/howto.html](https://software.acellera.com/docs/latest/htmd/developers/howto.html)

## Citing HTMD

If you use HTMD in your publication please cite:

Stefan Doerr, Matthew J. Harvey, Frank Noé, and Gianni De Fabritiis. HTMD: High-throughput molecular dynamics for molecular discovery. Journal of Chemical Theory and Computation, 2016, 12 (4), pp 1845–1852. [doi:10.1021/acs.jctc.6b00049](http://pubs.acs.org/doi/abs/10.1021/acs.jctc.6b00049)
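The repository-based setup described above can be sketched as follows; the checkout path is a placeholder, not a required location:

```shell
# Hypothetical path: adjust to where you cloned the HTMD repository,
# e.g. via: git clone https://github.com/Acellera/htmd.git "$HOME/htmd-src"
HTMD_SRC="$HOME/htmd-src"
# Prepend the checkout to PYTHONPATH so Python imports it ahead of the
# released conda package (which still supplies the dependencies).
export PYTHONPATH="$HTMD_SRC${PYTHONPATH:+:$PYTHONPATH}"
echo "$PYTHONPATH"
```

This keeps the released install for its dependencies while running the code from git, which is what the repository instructions suggest.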
acellera-kdeep
No description available on PyPI.
acellera-kdeeptrainer
No description available on PyPI.
acellera-ligann
No description available on PyPI.
acellera-ligdream
No description available on PyPI.
acellera-membranebuilder
No description available on PyPI.
acellera-msmbuilder
No description available on PyPI.
acellera-parameterize
No description available on PyPI.
acellera-pathwaymap
No description available on PyPI.
acellera-pdfreport
No description available on PyPI.
acellera-playmoleculeweb
No description available on PyPI.
acellera-plexview
No description available on PyPI.
acellera-pmbuilder
No description available on PyPI.
acellera-proteinprepare
No description available on PyPI.
acellera-rdock-api
No description available on PyPI.
acellera-simplerun
No description available on PyPI.
acellera-skeledock
No description available on PyPI.
acellera-sygmund
No description available on PyPI.
acellera-systembuilder
No description available on PyPI.
acellera-torchmd-net
No description available on PyPI.
ace-magnetometer
ACE magnetometer: Load and Plot

Load and plot ACE satellite magnetometer data for Python.

Python

Get data automatically from the FTP site ftp://mussel.srl.caltech.edu/pub/ace/browse/MAG16sec (ACE magnetometer, 16-second cadence) by date:

DownloadACE.py 2012-02-03 ~/data

Load and plot ACE magnetometer data:

PlotACE.py 2012-02-03 ~/data

Matlab

Matlab is not regularly used.

PlotACE()
ace-metrics
Metrics for the ACE Ecosystem

This library is intended to help with the calculation and management of metrics as they pertain to any data produced by the ACE ecosystem. As of now, the library meets a base set of use cases for answering questions that arose during the initial organic operational development that took place around ACE's development. So it's certainly not the end-all-be-all, but hopefully it can serve as a base for further statistical analysis, metricization, analytics, and real-time presentations.

Updates

12/11/2020: Graphing functions were added, as well as some new graphs for alerts, users, alert types, and events.

Alert Metrics

The following statistics are available for alert-based metrics and are used over and over again.

>>> from metrics.alerts import VALID_ALERT_STATS
>>> from metrics.alerts import FRIENDLY_STAT_NAME_MAP
>>>
>>> for stat in VALID_ALERT_STATS:
...     print(FRIENDLY_STAT_NAME_MAP[stat])
...
Total Open Time
Average Time to Disposition
Quickest Disposition
Slowest Disposition
Standard Deviation for Time to Disposition
Alert Quantities by Disposition

As of now, all of the above statistics are calculated by month and disposition. These stats can be calculated on any group of ACE alerts passed in a pandas dataframe. The metrics.alerts.get_alerts_between_dates function can be used to get a group of alerts you're interested in.

As of now, functionality exists to view any or all of these alert-based stats from the viewpoint of users and alert types. All this means is that alerts are filtered down to only the alerts that apply during a given date range for the respective alert types and/or users, and then the VALID_ALERT_STATS are calculated for that set of alerts.
Functionality related to viewing these stats as they relate to users and alert types is stored in respective files in the metrics/alerts directory.

The following independent alert-based tables are also available:

Operating Hours Summary: Essentially, a high-level view of how an ACE-based team is performing during the three different categories of operational time. From the function definition: summarize the overall cycle-time averages, the standard deviation in cycle times, and the alert quantities observed over the alerts passed when organized by month and placed in the respective operating hour category. Categories are: business hours, weekends, weeknights.

Overall Alert Cycle-Time Summary: Generate an overall statistical summary for alerts by month. Similar to the hours of operation summary but not broken up into time categories. From the function definition: organize alerts by month and then summarize the business hour and real hour cycle-time averages, the standard deviation in the business hour and real hour cycle-times, and the alert quantities observed over the alerts passed.

Total Alert Type Quantities: A straight-up count of alerts by type, between two dates.

Business Hours

For time-based statistics, you can also calculate based on business hours. When business hours are applied, only the time inside of business hours is counted when calculating time-based statistics. You can define business hours as needed with metrics.alerts.define_business_time().

The default business start hour is 0600, the default business end hour is 1800, and the default time zone is US/Eastern. Default holidays are defined at metrics.alerts.SiteHolidays. Holidays are the only defaults that currently can't be overridden. The GUI uses the SLA config section to override start hour, end hour, and timezone.

Expect metric generation to take about ten times as long when applying business hours: every alert time field has to be modified before the stats are calculated.

Event Metrics

Currently, an event and incident table summary is available. For the GUI and CLI, a count of emails per event is appended to each event, but this is a separate function call that's not necessary if you don't care about emails per event/incident.

CLI

Access to this library is made available on the CLI at `ace metrics`.

Command Line Instructions

usage: ace metrics [-h] [-so {json,csv,ascii_table,print}] [-fo {json,xlsx}]
                   [-f FILENAME] [-c COMPANIES] [-bh BUSINESS_HOURS]
                   [-s START_DATETIME] [-e END_DATETIME]
                   {alerts,events} ...

positional arguments:
  {alerts,events}
    alerts              alert based metrics
    events              event based metrics. With no arguments will return all events

optional arguments:
  -h, --help            show this help message and exit
  -so {json,csv,ascii_table,print}, --stdout-format {json,csv,ascii_table,print}
                        desired standard output format. NOTE: 'print' (the
                        default) will also summarize large tables. Use
                        'ascii_table' to avoid that.
  -fo {json,xlsx}, --fileout-format {json,xlsx}
                        desired file output format. Default is xlsx.
  -f FILENAME, --filename FILENAME
                        The name of a file to write results to.
  -c COMPANIES, --company COMPANIES
                        A list of company names to gather metrics for.
                        Default is all defined companies.
  -bh BUSINESS_HOURS, --business-hours BUSINESS_HOURS
                        Use business hours for all time based stats. Set like
                        start_hour,end_hour,time_zone. Example: 6,18,US/Eastern
  -s START_DATETIME, --start_datetime START_DATETIME
                        The start datetime data is in scope. Format:
                        YYYY-MM-DD HH:MM:SS. Default: 7 days ago.
  -e END_DATETIME, --end_datetime END_DATETIME
                        The end datetime data is in scope. Format:
                        YYYY-MM-DD HH:MM:SS. Default: now.

Examples

Get the overall alert counts, by disposition, from '2020-06-01 00:00:00' to now.
Print the results to an ASCII table:

ace metrics -so ascii_table -s '2020-06-01 00:00:00' alerts --alert_count

Same thing, but return the JSON representation:

ace metrics -so json -s '2020-06-01 00:00:00' alerts --alert_count | jq '.'

Generate all alert-based statistic tables for the user 'jdoe', and print the results as ASCII tables:

ace metrics -so ascii_table -s '2020-06-01 00:00:00' alerts users -u jdoe --all-stats

The following command will output the calculated metrics in JSON. Business hours will be applied to time-based calculations, and all statistics for alerts by disposition and month between '2020-06-01 00:00:00' and now will be calculated. Additionally, all alert statistics for the alert type 'mailbox' will be calculated during the same date range with business hours applied:

ace metrics -fo json -bh 6,18,US/Eastern -s '2020-06-01 00:00:00' alerts --all-stats types -t mailbox --all-stats

GUI

Almost all of the metrics that are available on the CLI are available through the GUI, with a few exceptions.

User Metrics

By default, users only have access to their own metrics. Users needing access to other users' statistics through the GUI can be added to the following configuration item ($ace config gui.full_metric_access):

[gui]
full_metric_access = 1,3

The full_metric_access config item expects a comma-separated list of user IDs that can get all stats through the GUI.

Exporting Metrics

From the GUI, you can export metrics to an XLSX spreadsheet or to JSON documents.

XLSX export: Because of limitations with a popular XLSX application, the names of data tables are heavily sanitized before being written to tabs on the resulting XLSX sheet. As a result, the first tab on every XLSX spreadsheet will be a table that shows the mapping from tab names to original pivot table names.

JSON: When JSON export is selected, all tables are converted to JSON and added to a tar.gz archive. Names are mostly preserved, but special characters that can cause problems when used in filenames are replaced with '-'.

Companies

The ability to select ACE data where the data belongs to a specific company is only made available if more than one company is defined.

Library Structure

Currently, the project is structured so that every directory in the root $SAQ_HOME/metrics directory is an ACE database table, data source, or data target. As of now, metrics are calculated around the ace.alerts and ace.events database tables and their relevant mapping tables. Any additional functionality should follow this structure. For instance, there is a desire to add metrics around ace.observables, which will happen and will likely be placed in an observables directory. Another example would be writing functionality that injects and tracks $SAQ_HOME/data/stats/modules statistics to be made available in this metrics lib. Such functionality should go into a new directory as well.

Goals, Enhancement Ideas, To-do

- Create a central location, in the ACE repo, for metrics to be managed in a modular and flexible way.
- Make metrics a library on the side of, and not directly interconnected with, the ACE codebase.
- Take metric calculations out of the GUI and have the GUI use the metrics library.
- Make all metrics available via the CLI.
- Make all or a subset of metrics persistable.
- Automatically update the metric pivot tables as the ACE database tables are updated. MySQL database replication looks like a viable solution; the python-mysql-replication library appears promising. One way to do this would be to create a daemon service that runs and continuously updates the pivot tables. This would allow for near real-time access to the data and also lay the groundwork for real-time graphical metrics to eventually populate a dashboard. The pivot table metrics could run in memory in something like Redis, as long as the metrics service is running, and/or in a time-series database, or explore more options.

TO-DO

- Implement pytest-based testing.
- Make holidays configurable.
- Should config items be separate from ACE or configurable in ACE?
- Add observable statistics.
- Distinguish detection points? yara, indicator, etc.
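To make the alert statistics above concrete, here is a small self-contained sketch of what they compute. This is plain stdlib Python, not the library's pandas-based API; the alert records and field layout are invented for illustration:

```python
# Illustrative only: ace-metrics computes these stats over pandas dataframes
# of ACE alerts; this sketch just shows what the named statistics mean.
from datetime import datetime
from statistics import mean, stdev

# Hypothetical alert records: (insert_date, disposition_time)
alerts = [
    (datetime(2020, 6, 1, 9, 0), datetime(2020, 6, 1, 9, 45)),
    (datetime(2020, 6, 1, 10, 0), datetime(2020, 6, 1, 10, 5)),
    (datetime(2020, 6, 2, 14, 0), datetime(2020, 6, 2, 16, 0)),
]

# Cycle time (time to disposition) per alert, in hours.
cycle_times = [(done - opened).total_seconds() / 3600 for opened, done in alerts]

stats = {
    "Average Time to Disposition (h)": mean(cycle_times),
    "Quickest Disposition (h)": min(cycle_times),
    "Slowest Disposition (h)": max(cycle_times),
    "Standard Deviation for Time to Disposition (h)": stdev(cycle_times),
}
for name, value in stats.items():
    print(f"{name}: {value:.2f}")
```

The library additionally groups these numbers by month and disposition, and (when business hours are applied) clips each interval to the configured business-hour windows before computing them.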
acenda
No description available on PyPI.
ace-nn
ace_nn

Introduction

This repo contains an experimental implementation of the ACE algorithm via neural network. It is shown by xiangxiang-xu that calculating optimal features by Alternating Conditional Expectation is equivalent to maximizing the H-score.

How to run

Three examples are provided (one for a continuous variable and the other two for discrete variables) and their results are the same as ace. The main function is ace_nn and its parameters are very similar to ace_cream.

import numpy as np
from ace_nn import ace_nn

# discrete case, binary symmetric channel with crossover probability 0.1
N_SIZE = 1000
x = np.random.choice([0, 1], size=N_SIZE)
n = np.random.choice([0, 1], size=N_SIZE, p=[0.9, 0.1])
y = np.mod(x + n, 2)
# set both x (cat=0) and y (cat=-1) as categorical type
tx, ty = ace_nn(x, y, cat=[-1, 0], epochs=100)

# continuous case
x = np.random.uniform(0, np.pi, 200)
y = np.exp(np.sin(x) + np.random.normal(size=200) / 2)
tx, ty = ace_nn(x, y)

For more detail, run help(ace_nn) to see the parameters and returns of this function.

Further discussion

Currently, the neural networks used to approximate the optimal $f(x)$ and $g(y)$ are two-layer MLPs with tanh as the activation function. More epochs are needed for large alphabets $|\mathcal{X}|$ and $|\mathcal{Y}|$, and the running time is not short. Also, batch_size and hidden_units_num can be hypertuned, and there is no guarantee that the current configuration of the neural network is optimal for solving ACE.

Application

We can use ace_nn(x, y, return_hscore=True) to calculate a lower bound of $\frac{\|B\|_F^2}{2}$.
acentoweb.addusergroup
acentoweb.addusergroup

Adds new users automatically to groups.

Features
- Adds settings to the medialog control panel
- The group settings will add all users to these groups on login

Installation

Install acentoweb.addusergroup by adding it to your buildout:

[buildout]
...
eggs =
    acentoweb.addusergroup

and then running bin/buildout

Authors: Espen Moe-Nilssen

Contribute
- Issue Tracker: https://github.com/collective/acentoweb.addusergroup/issues
- Source Code: https://github.com/collective/acentoweb.addusergroup
- Documentation: https://docs.plone.org/foo/bar

Support: If you are having issues, please let us know. We have a mailing list located at: [email protected]

License: The project is licensed under the GPLv2.

Contributors: Espen Moe-Nilssen, [email protected]

Changelog: (unreleased) Initial release. [espenmn]
acentoweb.collectionactions
Tell me what your product does

Features
- Adds behaviors to Collection so you can move or copy all items to a folder.

Installation

Install acentoweb.collectionactions by adding it to your buildout:

[buildout]
...
eggs =
    acentoweb.collectionactions

and then running bin/buildout

Contribute
- Issue Tracker: https://github.com/espenmn/acentoweb.collectionactions/issues
- Source Code: https://github.com/espenmn/acentoweb.collectionactions

Support: If you are having issues, please let us know ([email protected])

License: The project is licensed under the GPLv2.

Contributors: Espen Moe-Nilssen, [email protected]

Changelog: (unreleased) Initial release. [espenmn]
acentoweb.ecv
Exports ECV files from Plone CMS

Features
- View to export ECV files

Documentation: Full documentation for end users can be found in the "docs" folder, and is also available online at http://docs.plone.org/foo/bar

Translations: This product has been translated into English (default).

Installation

Install acentoweb.ecv by adding it to your buildout:

[buildout]
...
eggs =
    acentoweb.ecv

and then running bin/buildout

Contribute
- Issue Tracker: https://github.com/espenmn/acentoweb.ecv/issues
- Source Code: https://github.com/espenmn/acentoweb.ecv

Support: If you are having issues, please let us know: [email protected]

License: The project is licensed under the GPLv2.

Contributors: Espen Moe-Nilssen, [email protected]

Changelog
- (02.11.2022): Initial release. [espenmn]
- 1.1 (03.11.2022): Add view permission and portal action. [espenmn]
- 1.2 (03.11.2022): Remove (uncomment) permissions. [espenmn]
- 1.3 (05.12.2022): Added view ecv-display.
- 1.4 (09.12.2022): Change id field name; change ' to ".
- 1.4.2 (13.12.2022): Add replace in export (" => ').
acenus
No description available on PyPI.
acenv
PROJECT

Acenv (Algebraic Calculation Environment) is a Python 3 package that provides tools to perform operations on complex polynomial objects, useful if you are working on a project that requires your program to solve equations and parse polynomial expressions in general.

FEATURES
- Fully implemented in Python 3;
- No external dependencies;
- Class for representing real numbers as single objects, enabling operations with more complex numbers in an algebraic sense (the sum √2 + √2 returns 2√2, not the approximated float value 2.828...);
- Class for representing polynomials as single objects, enabling operations with algebraic entities containing one or more literal variables and real numbers;
- Polynomial factorization engine;
- Polynomial string parsers, which permit the user to simply type the readable string representation of the object, avoiding tedious object instantiation procedures;
- Algebraic expression parser, which takes a string representing an expression, simplifies it, and can return both the resulting polynomial and the factorized form.

PREREQUISITES AND INSTALLATION GUIDE

Since the library runs without any external dependency, there is no need to install any other code than the one provided here. To install the library, it is sufficient to use the 'pip' command in a terminal:

pip install acenv

DOCUMENTATION

The complete documentation should have been downloaded together with the rest of the package.

HOW TO USE

Just import the components you need:

from acenv import Ex

and start creating the objects you intend to use:

expression = Ex('x^2+4(1-x)')
print(expression.value.numerator)
print(expression.reduced)

and the output is:

'+4-4x^(1)+1x^(2)'
[['+2-1x^(1)', '+2-1x^(1)'], ['+1']]

For more, checking the documentation is highly encouraged.

NOTES

The project is still in its early stages of development; it was born as a school project of mine for my ICT class. A lot of work and changes have been done since then, but there are still a lot of things that can be hugely improved, so every feedback and bug report is widely appreciated.

LICENSE

MIT license; see LICENSE.txt, which should have been downloaded with the package, or simply go to https://opensource.org/licenses/MIT.

CONTACT ME

For any feedback, bug report or problems, you can contact me at: [email protected]
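The exact-arithmetic behaviour described above (√2 + √2 giving 2√2 rather than a float) can be illustrated with a tiny stand-alone sketch. This is not acenv's internal representation, just the underlying idea of keeping coefficient and radicand symbolic instead of evaluating to a float:

```python
# Illustration of exact surd arithmetic (not acenv's actual classes):
# represent a*sqrt(b) as a (coefficient, radicand) pair and only add
# terms whose radicands match, so no floating-point precision is lost.
from dataclasses import dataclass

@dataclass(frozen=True)
class Surd:
    coeff: int
    radicand: int  # value under the square root

    def __add__(self, other: "Surd") -> "Surd":
        if self.radicand != other.radicand:
            raise ValueError("unlike radicals: would need a sum of terms")
        return Surd(self.coeff + other.coeff, self.radicand)

    def __str__(self) -> str:
        return f"{self.coeff}\u221a{self.radicand}"

root2 = Surd(1, 2)
print(root2 + root2)  # 2√2, exactly -- not 2.828...
```

acenv generalizes this idea to full polynomials with multiple variables and a factorization engine; the sketch only captures the "algebraic, not approximated" point.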
aceofbases
ACEofBASEs is a tool to determine sites to be edited with CRISPR/Cas9 technology in an input sequence and to predict its potential off-target sites. The online version of ACEofBASEs is available at http://aceofbases.cos.uni-heidelberg.de/. This is a command-line version of ACEofBASEs, designed mainly to allow searching large volumes of sequences with higher flexibility.

If you use this tool for your scientific work, please cite it as: Cornean, A., Gierten, J., Welz, B., Mateo, J.L., Thumberger, T. and Wittbrodt, J. Precise in vivo functional analysis of DNA variants with base editing using ACEofBASEs target prediction. eLife (2022).

Requirements

ACEofBASEs is implemented in Python and requires version 3.5 or above.

In addition, we rely on the short-read aligner Bowtie 1 to identify the off-target sites. Bowtie can be downloaded from http://bowtie-bio.sourceforge.net/index.shtml in binary format for the main platforms. You need to create an indexed version of the genome sequence of your target species. This can be done with the tool bowtie-build, included in the Bowtie installation. For that you simply need a fasta file containing the genome sequence. To get the index you can do something like:

$ bowtie-build -r -f <your-fasta-file> <index-name>

The previous command will create the index files in the current folder.

To handle 2bit files and, optionally, gene and exon annotations, we use the Python library bx-python.

The exon and gene files basically contain the coordinates of those elements in bed format, i.e. the first three columns of the file. The exon file can contain two more columns with the ID and name of the corresponding gene. You can easily generate such files for your target organism using the script gff2bedFiles included in this package. As the name of this script suggests, you only need a GFF file with the annotation. Additionally, you can also use Ensembl BioMart, if your species is available there, to generate files complying with these requirements. In case of difficulties with these files, contact us and we can provide the files you need or help you generate them on your own.

Install

Please refer to the file INSTALL.md.

Usage

After a successful installation you should have the main ACEofBASEs executable, together with the script to generate the gene/exon files, ready to be used. You can run ACEofBASEs with the -h flag to get a detailed list of the available parameters. For instance:

$ aceofbases -h

At minimum it is necessary to specify the input (multi)fasta file (--input), the Bowtie index (--index) and the 2bit file (--twobit). In this case ACEofBASEs assumes that the Bowtie and blat executables can be found in the PATH system variable, that there are no gene and exon files to use, and that the rest of the parameters take default values. Notice that the index parameter to specify here refers to the name of the index, without any file extension, together with the path, if necessary.

A command for a typical run will look something like this:

$ aceofbases --input <query.fasta> --index <path/index-name> --twobit <file.2bit> --output <output-folder>

The result of the run will be three files for each sequence in the input query file. These files will have the extensions .fasta, .xls and .html, containing, respectively, the sequence of the target sites and their detailed information, either as a tab-separated file that can be opened with any spreadsheet program or as an html file to be opened with any web browser. The names of the output files will be taken from the names of the sequences in the input fasta file.

Generating Exon/Gene files

For any species you have to work with, it is very likely that there is an annotation file in GFF format. From any of these files you can generate the files that ACEofBASEs needs to annotate the off-target sites. The script gff2bedFiles expects as first argument the input file in GFF version 3 format. Files in this format can usually be found with their corresponding assemblies on the NCBI or Ensembl web sites. With the input file downloaded (it doesn't need to be uncompressed if it is in gz format), specify it as first argument to the script, followed by the prefix you prefer for the output files.

$ gff2bedFiles <input-gff> <prefix>

The result will be two files named <prefix>_exons.bed.gz and <prefix>_genes.bed.gz. These files are compressed, to save space, and can be passed directly to ACEofBASEs.

Docker image

ACEofBASEs is also available as a Docker image at https://hub.docker.com/r/juanlmateo/aceofbases

This image contains everything ready to use ACEofBASEs. Simply download the image with this command:

docker pull juanlmateo/aceofbases:latest

With this image you can run the commands aceofbases and gff2bedFiles, but you can also run Bowtie to create the index of your target species or faToTwoBit to create the 2bit file.

Below you have an example that shows how to get CRISPR/Cas candidates for a sequence using yeast as the target species. This example shows all the steps, from creating the Bowtie index and the exon and gene files to the generation of the final output.

# downloading the genome of the target species in fasta format
wget ftp://ftp.ensembl.org/pub/release-105/fasta/saccharomyces_cerevisiae/dna/Saccharomyces_cerevisiae.R64-1-1.dna.toplevel.fa.gz
# building the bowtie index from the fasta file
docker run -v `pwd`:/data/ aceofbases bowtie-build -r -f Saccharomyces_cerevisiae.R64-1-1.dna.toplevel.fa.gz saccharomyces_cerevisiae
# downloading the annotation of this assembly in GFF format
wget ftp://ftp.ensembl.org/pub/release-105/gff3/saccharomyces_cerevisiae/Saccharomyces_cerevisiae.R64-1-1.105.gff3.gz
# generating the exon and gene files
docker run -v `pwd`:/data/ aceofbases gff2bedFiles Saccharomyces_cerevisiae.R64-1-1.105.gff3.gz saccharomyces_cerevisiae
# generating the 2bit file
docker run -v `pwd`:/data/ aceofbases faToTwoBit Saccharomyces_cerevisiae.R64-1-1.dna.toplevel.fa.gz yeast.2bit
# defining the input sequence(s)
echo -e ">YDL194W\nATGGATCCTAATAGTAACAGTTCTAGCGAAACATTACGCCAAGAGAAACAGGGTTTCCTA" > test.fa
# running ACEofBASEs
docker run -v `pwd`:/data/ aceofbases aceofbases --input test.fa --index saccharomyces_cerevisiae --twobit yeast.2bit --exons saccharomyces_cerevisiae_exons.bed.gz --genes saccharomyces_cerevisiae_genes.bed.gz
acepy
Failed to fetch description. HTTP Status Code: 404
acepython
ACEPython - An equilibrium chemistry codeIntroduction|Usage|TauREx 3|Citing ACEPythonIntroductionACEPython is a Python wrapper for the FORTRAN equilibrium chemistry code developed byAgúndez et al. 2012. It can rapidly compute the equilibirum chemical scheme for a given temperature and pressure.InstallationACEPython can be installed with prebuilt wheels using pip:pipinstallacepythonOr, if you prefer, you can build it from source which requires a FORTRAN and C compiler. The following commands will build and install ACEPython:gitclonehttps://github.com/ucl-exoplanets/acepython.gitcdacepython pipinstall.UsageACEPython can be used to compute the equilibrium chemistry for a given temperature and pressure. Temperature and pressure must be created with astropy units. For pressure, any unit can be used (Pa, bar etc). The following example shows how to compute the equilibrium chemistry for a column of atmosphere:fromacepythonimportrun_acefromastropyimportunitsasuimportnumpyasnpimportmatplotlib.pyplotasplttemperature=np.linspace(3000,1000,100)<<u.Kpressure=np.logspace(6,-2,100)<<u.barspecies,mix_profile,mu_profile=run_ace(temperature,pressure,)species_to_see=["H2","H20","CH4","NH3","C2H2","CO","CO2","H2CO"]fig,(ax1,ax2)=plt.subplots(1,2,figsize=(10,5))fori,specinenumerate(species):ifspecinspecies_to_see:ax1.plot(mix_profile[i],pressure,label=spec)ax1.set_yscale("log")ax1.set_xscale("log")ax1.invert_yaxis()ax1.set_ylabel("Pressure (bar)")ax1.set_xlabel("VMR")ax1.legend()ax2.plot(mu_profile,pressure)ax2.set_yscale("log")ax2.invert_yaxis()ax2.set_ylabel("Pressure (bar)")ax2.set_xlabel("Mean molecular weight (au)")plt.show()Should produce the following figure:Custom chemical schemeBy default the elements in the chemical scheme are H, He, C, N, O at log abundances 12, 10.93, 8.39, 7.86, 8.73 respectively. 
The abundances can be changed by passing the elements and corresponding abundances to the `run_ace` function:

```python
species, mix_profile, mu_profile = run_ace(
    temperature,
    pressure,
    elements=["H", "He", "C", "N", "O"],
    abundances=[12, 10.93, 8.39, 7.86, 7.73],
)
```

where we have changed `O` to have a log abundance of 7.73.

You can customize the species included by passing in thermochemical and species data files. For example, if we have a custom thermochemical data file called `custom_thermochemical_data.dat` and a custom species data file called `custom_species_data.dat` that includes sulphur, we can run ACEPython with:

```python
species, mix_profile, mu_profile = run_ace(
    temperature,
    pressure,
    elements=["H", "He", "C", "N", "O", "S"],
    abundances=[12, 10.93, 8.39, 7.86, 7.73, 7.0],
    thermochemical_data="custom_thermochemical_data_w_S.dat",
    species_data="custom_species_data_w_S.dat",
)
```

## TauREx3

ACEPython also includes a plugin for TauREx 3.1 that allows you to use ACEPython as a chemistry scheme. In the input file you can select it in the `[Chemistry]` section using `acepython` with arguments:

```
[Chemistry]
chemistry = acepython
# He/H ratio (optional)
he_h_ratio = 0.83
# Elements excluding H, He (optional)
elements = C, N, O
# log abundances (optional)
abundances = 8.39, 7.86, 8.73
# Custom species data file (optional)
spec_file = custom_species_data.dat
# Custom thermochemical data file (optional)
thermo_file = custom_thermochemical_data.dat
```

## Citing ACEPython

If you use ACEPython in your research, please cite the following papers:

```bibtex
@ARTICLE{Agundez2012,
       author = {{Ag{\'u}ndez}, M. and {Venot}, O. and {Iro}, N. and {Selsis}, F. and
                 {Hersant}, F. and {H{\'e}brard}, E. and {Dobrijevic}, M.},
        title = "{The impact of atmospheric circulation on the chemistry of the hot Jupiter HD 209458b}",
      journal = {A\&A},
     keywords = {astrochemistry, planets and satellites: atmospheres, planets and satellites: individual: HD 209458b, Astrophysics - Earth and Planetary Astrophysics},
         year = "2012",
        month = "Dec",
       volume = {548},
          eid = {A73},
        pages = {A73},
          doi = {10.1051/0004-6361/201220365},
archivePrefix = {arXiv},
       eprint = {1210.6627},
 primaryClass = {astro-ph.EP},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2012A&A...548A..73A},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

@ARTICLE{2021ApJ...917...37A,
       author = {{Al-Refaie}, A.~F. and {Changeat}, Q. and {Waldmann}, I.~P. and {Tinetti}, G.},
        title = "{TauREx 3: A Fast, Dynamic, and Extendable Framework for Retrievals}",
      journal = {\apj},
     keywords = {Open source software, Astronomy software, Exoplanet atmospheres, Radiative transfer, Bayesian statistics, Planetary atmospheres, Planetary science, 1866, 1855, 487, 1335, 1900, 1244, 1255, Astrophysics - Instrumentation and Methods for Astrophysics, Astrophysics - Earth and Planetary Astrophysics},
         year = 2021,
        month = aug,
       volume = {917},
       number = {1},
          eid = {37},
        pages = {37},
          doi = {10.3847/1538-4357/ac0252},
archivePrefix = {arXiv},
       eprint = {1912.07759},
 primaryClass = {astro-ph.IM},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2021ApJ...917...37A},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

@ARTICLE{2022ApJ...932..123A,
       author = {{Al-Refaie}, A.~F. and {Changeat}, Q. and {Venot}, O. and {Waldmann}, I.~P. and {Tinetti}, G.},
        title = "{A Comparison of Chemical Models of Exoplanet Atmospheres Enabled by TauREx 3.1}",
      journal = {\apj},
     keywords = {Open source software, Publicly available software, Chemical abundances, Bayesian statistics, Exoplanet atmospheres, Exoplanet astronomy, Exoplanet atmospheric composition, Exoplanets, Radiative transfer, 1866, 1864, 224, 1900, 487, 486, 2021, 498, 1335, Astrophysics - Earth and Planetary Astrophysics, Astrophysics - Instrumentation and Methods for Astrophysics},
         year = 2022,
        month = jun,
       volume = {932},
       number = {2},
          eid = {123},
        pages = {123},
          doi = {10.3847/1538-4357/ac6dcd},
archivePrefix = {arXiv},
       eprint = {2110.01271},
 primaryClass = {astro-ph.EP},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2022ApJ...932..123A},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
```
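Alongside the species mixing ratios, `run_ace` returns a `mu_profile`. Assuming this is the mean-molecular-weight profile (as the name suggests), the quantity for a single layer is just the mixing-ratio-weighted sum of species masses, μ = Σᵢ Xᵢ mᵢ. A stdlib-only illustration of that relation (the function and values below are illustrative, not part of ACEPython):

```python
# Illustrative sketch (not ACEPython code): the mean molecular weight of a
# layer is the volume-mixing-ratio-weighted sum of species masses.
def mean_molecular_weight(mix_ratios, masses):
    """mix_ratios: dict species -> volume mixing ratio (should sum to ~1).
    masses: dict species -> molecular mass in amu."""
    return sum(x * masses[s] for s, x in mix_ratios.items())

# A roughly solar-like hydrogen/helium dominated layer:
masses = {"H2": 2.016, "He": 4.003}
layer = {"H2": 0.85, "He": 0.15}

mu = mean_molecular_weight(layer, masses)
print(round(mu, 3))  # 2.314 amu, the familiar ~2.3 of H2/He atmospheres
```

A full `mu_profile` is simply this number evaluated per pressure level from the per-level mixing ratios.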
aceql
# AceQL HTTP Python Client SDK v5.7.2 - User Guide

March 2, 2023

- Fundamentals
  - License
  - Python Versions & DB-API 2.0
  - AceQL Server side compatibility
  - Installation
  - Data transport
  - Best practices for fast response time
  - Datatypes
- Usage
  - Quickstart
  - Handling Exceptions
    - The error type
    - Most common AceQL server messages
    - HTTP Status Codes
- Advanced Usage
  - Managing NULL values
    - Setting NULL values
    - Reading NULL values
  - Transactions
  - Proxies
  - Timeouts
  - BLOB management
    - BLOB creation
    - BLOB reading
    - Managing BLOB upload progress
  - Calling AceQL Java stored procedures
  - Using outer authentication without a password and with an AceQL Session ID
  - Using the Metadata Query API
    - Downloading database schema into a file
    - Accessing remote database main properties
    - Getting Details of Tables and Columns

## Fundamentals

This document describes how to use the AceQL SDK / module and gives some details about how it operates with the AceQL Server side.

The AceQL SDK / module allows you to wrap the AceQL HTTP APIs to access remote SQL databases and/or SQL databases in the cloud by simply including standard Python SQL calls in your code, just like you would do for any local database. There is zero learning curve and usage is straightforward.

The AceQL Server operation is described in the AceQL HTTP Server Installation and Configuration Guide, whose content is sometimes referred to in this User Guide.

On the remote side, like the AceQL Server access to the SQL database using Java JDBC, we will sometimes use the JDBC terminology (ResultSet, etc.) in this document.
Nevertheless, knowledge of Java or JDBC is not a requirement.

### License

The SDK is licensed with the liberal Apache 2.0 license.

### Python Versions & DB-API 2.0

The module supports Python 3.7–3.11. It provides a SQL interface compliant with the DB-API 2.0 specification described by PEP 249.

### AceQL Server side compatibility

This SDK version requires AceQL HTTP server side v12.2+.

### Installation

`pip install aceql`

### Data transport

- HTTP request parameters are transported in UTF-8 format and JSON format is used for data and class transport.
- All requests are streamed:
  - Output requests (from the client side) are streamed directly from the socket to the server to avoid buffering any content body.
  - Input responses (for the client side) are streamed directly from the socket to the server to efficiently read the response body.
- Large content (query results, Blobs/Clobs, etc.) is transferred using files. It is never loaded in memory. Streaming techniques are always used to read and write this content.

### Best practices for fast response time

- Every HTTP exchange between the client and server side is time-consuming, because the HTTP call is synchronous and waits for the server's response.
- Try to avoid coding SQL calls inside loops, as this can reduce execution speed. Each SQL call will send an HTTP request and wait for the response from the server.
- Note that AceQL is optimized as much as possible.
A SELECT call returning a huge data volume will not consume memory on the server or client side: AceQL uses input stream and output stream I/O for data transfer.

Server JDBC ResultSet retrieval is as fast as possible:
- The ResultSet creation is done once on the server by the executeQuery.
- The rows are all dumped at once on the servlet output stream by the server.
- The client side gets the ResultSet content as a file.
- All data reading commands are executed locally on the client side with forward-only reading.

It is highly recommended to always use `Cursor.executemany` with autocommit off when you have many rows to INSERT or UPDATE.

### Datatypes

The main server side JDBC data types for columns are supported: Boolean, Blob/Clob, Integer, Short, Double, Float, BigDecimal, Long, String, Date, Time, and Timestamp. Note that the AceQL module does not allow you to specify data types to use; data types are implicitly chosen by the module.

Parameter values are automatically converted to their SQL equivalent. The following Python types can thus be sent to the remote server without any problem:

| Python type / class | SQL type |
|---|---|
| Tuple (None, SqlNullType.&lt;SQL_TYPE&gt;) | NULL |
| str | CHAR, VARCHAR |
| int | INTEGER or BIGINT, depending on size |
| bool | BIT, BOOL |
| float | REAL |
| date | DATE |
| datetime | TIMESTAMP |
| time | TIME |
| File Object | BLOB |

NULL and BLOB types are explained in Advanced Usage.

This is how SQL types are converted to Python types by default:

| SQL type | Python type / class |
|---|---|
| NULL | None |
| CHAR, VARCHAR | str |
| TINYINT, INTEGER | int |
| BIGINT | int |
| BOOL, BIT | bool |
| DATE | date |
| DECIMAL, DOUBLE_PRECISION, FLOAT, NUMERIC, REAL | float |
| TIMESTAMP | datetime |
| TIME | time |
| BLOB | Response stream |

## Usage

### Quickstart

To use the module, just create a Connection object that represents the database:

```python
import aceql

# URL of the AceQL server, remote SQL database name & authentication info
url = "https://www.acme.com:9443/aceql"
database = "sampledb"
username = "user1"
password = "password1"

connection = aceql.connect(url=url, username=username,
                           password=password, database=database)
```

The alternate syntax with parameters in the URL is
supported:

```python
url = "https://www.acme.com:9443/aceql?username=user1&password=password1&database=sampledb"
connection = aceql.connect(url=url)
```

The schema of the database is here: sampledb.

Once you have a Connection, you can create a Cursor object and call its execute() method to perform SQL commands.

The following sample shows how to insert 3 new customers using prepared statements:

```python
sql = "insert into customer values (?, ?, ?, ?, ?, ?, ?, ?)"
params = (1, 'Sir', 'John', 'Smith I', '1 Madison Ave',
          'New York', 'NY 10010', '+1 212-586-7001')
cursor.execute(sql, params)
rows_inserted = cursor.rowcount

sql = "insert into customer values (?, ?, ?, ?, ?, ?, ?, ?)"
params = (2, 'Sir', 'William', 'Smith II', '1 Madison Ave',
          'New York', 'NY 10010', '+1 212-586-7002')
cursor.execute(sql, params)
rows_inserted += cursor.rowcount

sql = "insert into customer values (?, ?, ?, ?, ?, ?, ?, ?)"
params = (3, 'Sir', 'William', 'Smith III', '1 Madison Ave',
          'New York', 'NY 10010', '+1 212-586-7003')
cursor.execute(sql, params)
rows_inserted += cursor.rowcount

print("rows inserted: " + str(rows_inserted))
```

which returns:

```
rows inserted: 3
```

cursor.execute() sends the SQL order and the parameters to the server, which executes it.

We view the first inserted customer:

```python
sql = "select * from customer where customer_id = ?"
params = (1,)
cursor.execute(sql, params)
row = cursor.fetchone()
print(row)
```

which returns:

```
(1, 'Sir ', 'John', 'Smith I', '1 Madison Ave', 'New York', 'NY 10010 ', '+1 212-586-7001')
```

The remote result set is downloaded into a file that is read line by line at each Cursor.fetchone() call.

We have access to the name and type of each column:

```python
for desc in cursor.description:
    print(desc[0] + ", " + desc[1])
```

which returns:

```
customer_id, INTEGER
customer_title, CHAR
fname, VARCHAR
lname, VARCHAR
addressline, VARCHAR
town, VARCHAR
zipcode, CHAR
phone, VARCHAR
```

It is recommended to close the Cursor at the end of SELECT usage in order to release the underlying file stream and delete the associated temp file:

```python
cursor.close()
```

We now view all the customers and apply good practice to force the closing
of Cursor:

```python
with closing(connection.cursor()) as cursor:
    sql = "select * from customer where customer_id >= ? order by customer_id"
    params = (1,)
    cursor.execute(sql, params)
    print("rows: " + str(cursor.rowcount))
    rows = cursor.fetchall()
    for row in rows:
        print(row)
```

which returns:

```
(1, 'Sir ', 'John', 'Smith', '1 Madison Ave', 'New York', 'NY 10010 ', '+1 212-586-7001')
(2, 'Sir ', 'William', 'Smith II', '1 Madison Ave', 'New York', 'NY 10010 ', '+1 212-586-7002')
(3, 'Sir ', 'William', 'Smith III', '1 Madison Ave', 'New York', 'NY 10010 ', '+1 212-586-7003')
rows: 3
```

At the end of our session, it is highly recommended to close the Connection:

```python
# Make sure the connection is always closed in order to close and release
# the server connection into the pool:
connection.close()
```

### Handling Exceptions

Except for TypeError, exceptions thrown are always an instance of Error. The Error contains 5 pieces of information:

| Info | Description |
|---|---|
| Reason | The error message. Retrieved with the reason property. |
| Error Type | See below for description. Retrieved with the error_type property. |
| Exception | The original Exception that is the cause, if any. Retrieved with the cause property. |
| Http Status Code | See below for description. Retrieved with the http_status_code property. |
| Server Exception | The Java Exception stack trace thrown on the server side, if any. Retrieved with the remote_stack_trace property. |

#### The error type

The error type allows you to get the type of error, and where the error occurred. It is retrieved with the Error.error_type property.

- 0: The error occurred locally on the client side. See the http_status_code property for more info. Typical cases: no Internet connection, proxy authentication required.
- 1: The error is due to a JDBC Exception. It was raised by the remote JDBC Driver and is rerouted by AceQL as is. The JDBC error message is accessible via the reason property. Typical case: an error in the SQL statement. Examples: wrong table or column name.
- 2: The error was raised by the AceQL Server. This means that the AceQL Server expected a value or parameter that was not sent by the client side.
Typical cases: misspelling in URL parameter, missing required request parameters, JDBC Connection expiration, etc. The detailed error message is accessible via the reason property. See below for the most common AceQL Server error messages.
- 3: The AceQL Server forbade the execution of the SQL statement for a security reason. For security reasons, the reason property gives access to voluntarily vague details.
- 4: The AceQL Server is on failure and raised an unexpected Java Exception. The stack trace is included and accessible via the remote_stack_trace property.

#### Most common AceQL server messages

AceQL Server error messages (Error.error_type = 2):

- AceQL main servlet not found in path
- An error occurred during Blob download
- An error occurred during Blob upload
- Blob directory defined in DatabaseConfigurator.getBlobDirectory() does not exist
- Connection is invalidated (probably expired)
- Database does not exist
- Invalid blob_id. Cannot be used to create a file
- Invalid blob_id. No Blob corresponding to blob_id
- Invalid session_id
- Invalid username or password
- No action found in request
- Unable to get a Connection
- Unknown SQL action or not supported by software

#### HTTP Status Codes

The HTTP Status Code is accessible with the Error.http_status_code property.
The HTTP Status Code is 200 (OK) on successful completion calls.

When an error occurs:

- If error type is 0, the HTTP Status Code is returned by the client side and may take all possible values in a malformed HTTP call.
- If error type is > 0, the HTTP Status Code can take one of the following values returned by the server side:

| HTTP Status Code | Description |
|---|---|
| 400 (BAD REQUEST) | Missing element in URL path; missing request parameters; all JDBC errors raised by the remote JDBC Driver |
| 401 (UNAUTHORIZED) | Invalid username or password in connect; invalid session_id; the AceQL Server forbade the execution of the SQL statement for security reasons |
| 404 (NOT_FOUND) | BLOB directory does not exist on server; BLOB file not found on server |
| 500 (INTERNAL_SERVER_ERROR) | The AceQL Server is on failure and raised an unexpected Java Exception |

## Advanced Usage

### Managing NULL values

#### Setting NULL values

NULL values are handled in a specific way, because the remote server must know the type of the NULL value.

To create a NULL value parameter, create a tuple of 2 elements:
- The first value is None.
- The second value is one of the SqlNullType constants that defines the type of the parameter.

This 2-element tuple is then inserted in the tuple of the prepared statement parameters:

```python
sql = "insert into customer values (?, ?, ?, ?, ?, ?, ?, ?)"
params = (4, 'Sir', 'William', 'Smith IV', '1 Madison Ave',
          'New York', 'NY 10010', (None, SqlNullType.VARCHAR))
cursor.execute(sql, params)
```

#### Reading NULL values

A NULL column value is returned as None:

```python
sql = "select * from customer_3 where customer_id = ? order by customer_id"
params = (4,)
cursor.execute(sql, params)
row = cursor.fetchone()
print(row)
```

Execution will return:

```
(4, 'Sir ', 'William', 'Smith IV', '1 Madison Ave', 'New York', 'NY 10010 ', None)
```

In this AceQL module version there is no difference for string columns between a real NULL in the database and the "NULL" string.

### Transactions

Transactions are supported by the module.
Because the remote server executes JDBC code, client code must follow the JDBC requirement to set the auto commit mode to false prior to executing a transaction. This is done with Connection.set_auto_commit(False). It is good practice to always reset auto commit mode to true at the end of your transactions. Note that the auto commit mode state is undefined when a Connection is created with an aceql.connect() call.

Transaction example:

```python
# To do prior to the transaction
self.connection.set_auto_commit(False)

cursor = self.connection.cursor()
try:
    # Create a Customer
    sql = "insert into customer values (?, ?, ?, ?, ?, ?, ?, ?)"
    params = (customer_id, 'Sir', 'John', 'Smith', '1 Madison Ave',
              'New York', 'NY 10010', '+1 212-586-7000')
    cursor.execute(sql, params)

    # Create an Order for this Customer
    sql = "insert into orderlog values ( ?, ?, ?, ?, ?, ?, ?, ?, ? )"

    the_datetime = datetime.now()
    the_date = the_datetime.date()

    # (None, SqlNullType.BLOB) means to set the jpeg_image BLOB
    # column to NULL on server:
    params = (customer_id, item_id, "Item Description", 9999,
              the_date, the_datetime, (None, SqlNullType.BLOB), 1, 2)
    cursor.execute(sql, params)

    self.connection.commit()
except Error as e:
    print(e)
    self.connection.rollback()
    raise e
finally:
    self.connection.set_auto_commit(True)  # Good practice
    cursor.close()
```

### Proxies

The AceQL module supports proxies, using the proxy syntax of Requests. The aceql module uses Requests for HTTP communications with the remote server.
All options of a new AceQL connection are passed with theConnectionOptionswrapper.importaceqlfromaceqlimportConnectionOptions# URL of the AceQL server, Remote SQL database name & authentication infourl="https://www.acme.com:9443/aceql"database="sampledb"username="user1"password="password1"proxies={'http':'http://10.10.1.10:3128','https':'http://10.10.1.10:1080',}# We use the ConnectionOptions wrapper class to pass the proxiesconnection_options=ConnectionOptions(proxies=proxies)connection=aceql.connect(url=url,username=username,password=password,database=database,connection_options=connection_options)Authenticated proxies are supported. Just create anaceql.ProxyAuthinstance and pass it toaceql.connect()with theConnectionOptionswrapper.importaceqlfromaceqlimportConnectionOptionsfromaceqlimportProxyAuthfromsamplesimportmy_proxy# URL of the AceQL server, Remote SQL database name & authentication infourl="https://www.acme.com:9443/aceql"database="sampledb"username="user1"password="password1"proxies={"http":"http://localhost:8081",}# Get the proxy credentials with our own application methodsproxy_username=my_proxy.get_username()proxy_password=my_proxy.get_password()# The AceQL ProxyAuth class allows to define the proxy credentialsauth=ProxyAuth(proxy_username,proxy_password)# We use the ConnectionOptions wrapper class to pass both the proxies & the authconnection_options=ConnectionOptions(proxies=proxies,auth=auth)connection=aceql.connect(url=url,username=username,password=password,database=database,connection_options=connection_options)The AceQL module usesrequests-toolbeltfor authenticated proxy management.TimeoutsUse thetimeoutparameter ofConnectionOptionsto pass a timeout value in theRequests Timeoutsformat.If no timeout is specified explicitly, requests do not time out.connection_options=ConnectionOptions(timeout=10)connection=aceql.connect(url=url,username=username,password=password,database=database,connection_options=connection_options)BLOB managementThe AceQL 
module supports BLOB creation and reading. Methods are implemented using streaming techniques to keep memory consumption low. CLOBs are not supported in this version.

### BLOB creation

BLOB creation is supported by passing a tuple with a File Object as a parameter of a prepared statement:

```python
sql = "insert into orderlog values ( ?, ?, ?, ?, ?, ?, ?, ?, ? )"

filename = os.getcwd() + sep + "item_1_image.png"
fd = open(filename, "rb")  # File will be closed by AceQL
blob_tuple = (fd,)

params = (1, 1, "Item 1 Description", 9999,
          datetime.now(), datetime.now().date(), blob_tuple, 1, 2)
cursor.execute(sql, params)
```

### BLOB reading

BLOB reading is supported through Cursor.get_blob_stream(column_index). The stream can then be read with a for loop that iterates on the response, using syntax provided by Requests:

```python
sql = "select customer_id, item_id, jpeg_image from orderlog " \
      "where customer_id = ? and item_id = ?"
params = (1, 1)
cursor.execute(sql, params)
row = cursor.fetchone()

# You can get the BLOB length if you want to use a progress indicator
blob_length = cursor.get_blob_length(2)
print("blob length: " + str(blob_length))

# Get the stream to the remote BLOB
response = cursor.get_blob_stream(2)

# Download is streamed and written into filename
filename = os.path.expanduser("~") + sep + "jpeg_image.jpg"
with open(filename, 'wb') as fd:
    for chunk in response.iter_content(chunk_size=2048):
        fd.write(chunk)

stat_info = os.stat(filename)
print("file length: " + str(stat_info.st_size))
```

### Managing BLOB upload progress

You may want to give your users a progress bar when uploading BLOBs. The ProgressIndicator.percent property allows you to get the current percent of upload.
Value will be incremented automatically during upload.

To activate the update mechanism:

1. Set the BLOB length along with the File Object in the tuple of the BLOB prepared statement parameter:

```python
file_length = os.stat(filename).st_size
fd = open(filename, "rb")
blob_tuple = (fd, file_length)
```

2. Create your ProgressIndicator instance and set it on the Connection instance before the Cursor.execute(sql, params) call:

```python
progress_indicator = ProgressIndicator()
connection.set_progress_indicator(progress_indicator)
```

You can then read the ProgressIndicator.percent property in your watching thread.

Code sample:

```python
with closing(connection.cursor()) as cursor:
    filename = os.getcwd() + sep + "item_1_image.jpg"
    file_length = os.stat(filename).st_size

    fd = open(filename, "rb")
    blob_tuple = (fd, file_length)

    progress_indicator = ProgressIndicator()
    connection.set_progress_indicator(progress_indicator)

    sql = "insert into orderlog values ( ?, ?, ?, ?, ?, ?, ?, ?, ? )"
    params = (1, 1, "Item 1 Description", 9999,
              datetime.now(), datetime.now().date(), blob_tuple, 1, 2)

    # cursor.execute() uploads the BLOB by chunks and increments
    # the ProgressIndicator.percent property
    cursor.execute(sql, params)
```

### Calling AceQL Java stored procedures

The AceQL client SDK allows executing a remote server Java class that implements the AceQL Server org.kawanfw.sql.api.server.executor.ServerQueryExecutor interface and that returns the rows of the SELECT. See the org.kawanfw.sql.api.server.executor.ServerQueryExecutor Javadoc.

The usage on the client side is straightforward with the Cursor.execute_server_query() method:

```python
server_query_executor_class_name = "com.mycompany.MyServerQueryExecutor"
my_parameters = [1]
cursor.execute_server_query(server_query_executor_class_name, my_parameters)

print("cursor.rowcount : " + str(cursor.rowcount))
rows = cursor.fetchall()

print("fetchall:")
for row in rows:
    print(row)
cursor.close()
```

### Using outer authentication without a password and with an AceQL Session ID

Some working environments (Intranet, etc.) require that the client user authenticates without a password.
Thus, it is not possible for these users to authenticate through the AceQL client SDK.

In this case, you may directly use the native HTTP login API to authenticate the users and retrieve the session_id returned by the API. Just pass the value of the session_id to the ConnectionOptions instance, along with a None password, to connect:

```python
import aceql
from aceql import ConnectionOptions
from samples import my_session_id_builder

# URL of the AceQL server, remote SQL database name & authentication info without password
url = "https://www.acme.com:9443/aceql"
database = "sampledb"
username = "user1"

# Our application will get the session_id to use
session_id = my_session_id_builder.get_session_id_from_login_api()

# We use the ConnectionOptions wrapper to tell to use session_id instead of a password
connection_options = ConnectionOptions(session_id=session_id)

connection = aceql.connect(url=url, username=username, password=None,
                           database=database, connection_options=connection_options)
```

### Using the Metadata Query API

The metadata API allows:
- downloading a remote database schema in HTML or text format,
- getting a remote database's main properties,
- getting the list of tables,
- getting the details of each table.

It also allows wrapping remote tables, columns, indexes, etc. into easy-to-use provided Python classes: Table, Index, Column, etc.

The first step is to get an instance of RemoteDatabaseMetaData:

```python
remote_database_meta_data = RemoteDatabaseMetaData(connection)
```

#### Downloading database schema into a file

Downloading a schema into a File is done through the db_schema_download(filename) method.
See the RemoteDatabaseMetaData documentation:

```python
filename = os.path.expanduser("~") + os.sep + "db_schema.html"
remote_database_meta_data.db_schema_download(filename)
```

See an example of the built HTML schema: db_schema.out.html

#### Accessing remote database main properties

The JdbcDatabaseMetaData class wraps the main values retrieved by a remote server JDBC call to java.sql.Connection.getMetaData():

```python
jdbc_meta_data = remote_database_meta_data.get_jdbc_database_meta_data()
print("Major Version: " + str(jdbc_meta_data.getJDBCMajorVersion))
print("Minor Version: " + str(jdbc_meta_data.getJDBCMinorVersion))
print("IsReadOnly   : " + str(jdbc_meta_data.isReadOnly))
```

#### Getting Details of Tables and Columns

See the RemoteDatabaseMetaData documentation:

```python
print("Print the column details of each table:")
for table_name in table_names:
    table = remote_database_meta_data.get_table(table_name)
    print()
    print("Columns of table: " + table_name)
    for column in table.columns:
        print(column)
```
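Because the module is DB-API 2.0 (PEP 249) compliant, the connect / cursor / execute pattern used throughout this guide is the same one used by any PEP 249 driver. A runnable sketch of that shared pattern, using the stdlib sqlite3 driver as a stand-in for a remote AceQL connection (sqlite3 also uses `?` placeholders; the table and rows are illustrative):

```python
import sqlite3

# Any DB-API 2.0 driver follows the same connect/cursor/execute shape
# that aceql exposes; sqlite3 stands in for the remote connection here.
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("create table customer (customer_id integer, fname text)")

# executemany + a single commit is the bulk-insert best practice noted
# above: one round trip per batch instead of one per row.
rows = [(1, "John"), (2, "William"), (3, "William")]
cursor.executemany("insert into customer values (?, ?)", rows)
connection.commit()

# Parameterized SELECT, exactly as in the Quickstart section
cursor.execute("select fname from customer where customer_id = ?", (1,))
first = cursor.fetchone()
print(first)  # ('John',)

cursor.close()
connection.close()
```

Swapping `sqlite3.connect(...)` for `aceql.connect(...)` leaves the rest of the code unchanged; that portability is the point of PEP 249 compliance.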
acequia
ACEQUIA

Acequia is a python package to facilitate data management for groundwater time series. It provides tools for Dutch groundwater practitioners who deal with files from Dinoloket, Menyanthes, and KNMI precipitation files.

Current functionality:
- Read groundwater head data from Dinoloket csv files and Menyanthes Hydromonitor csv files.
- Read KNMI precipitation and evaporation data from KNMI csv files or download data directly from the KNMI website.
- Calculate descriptive statistics for groundwater head series, taking into account hydrological years and measurements taken on the 14th and 28th of each month.

Getting started

Acequia depends on Fiona for reading spatial data. Unfortunately, Fiona depends on GDAL, which cannot be installed using pip. Therefore Fiona must be installed on your machine before you can install Acequia. For example, if you are using a clean conda environment with python installed you can do:

>>> conda install fiona
>>> pip install acequia

Acequia depends on the following packages: numpy, matplotlib, pandas, scipy, statsmodels, seaborn, geopandas, simplekml.

Basic example

As a very basic example, read a Dinoloket csv file named B28A0475002_1.csv and resample to measurements on the 14th and 28th:

>>> import acequia as aq
>>> gw = aq.GwSeries.from_dinogws('B28A0475002_1.csv')
>>> sr = gw.heads(ref='datum')
>>> sr1428 = gw.heads1428(maxlag=3)
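The 14th/28th resampling used above (`heads1428` with `maxlag=3`) keeps, for each half-month target date, the observation closest to the 14th or 28th within the maximum lag. A hedged, stdlib-only sketch of that selection rule (the function below is an illustration of the idea, not Acequia's implementation):

```python
from datetime import date

def select_1428(measurements, maxlag=3):
    """Keep, per 14th/28th target date, the closest measurement
    within `maxlag` days. measurements: dict date -> head value."""
    best = {}  # target date -> (lag, value)
    for d, value in sorted(measurements.items()):
        for target_day in (14, 28):
            target = date(d.year, d.month, target_day)
            lag = abs((d - target).days)
            if lag <= maxlag:
                if target not in best or lag < best[target][0]:
                    best[target] = (lag, value)
    return {t: v for t, (lag, v) in best.items()}

obs = {date(2024, 1, 13): 1.52,   # 1 day before the 14th -> kept
       date(2024, 1, 20): 1.48,   # too far from both targets -> dropped
       date(2024, 1, 28): 1.45}   # exactly on the 28th -> kept
print(select_1428(obs))
```

With `maxlag=3` the January 20 reading is discarded while the other two are assigned to the 14th and the 28th, which is the behaviour the descriptive statistics rely on.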
acer
Aho-Corasick

Paper: https://pdfs.semanticscholar.org/3547/ac839d02f6efe3f6f76a8289738a22528442.pdf
Reference: https://www.cs.uku.fi/~kilpelai/BSA05/lectures/slides04.pdf
acerim
ACERIM
acerlinnester
Example Package

This is a nested list package. You can use Github-flavored Markdown to write your content.
acertmgr
ACERTMGR

This is an automated certificate manager using ACME/letsencrypt with minimal dependencies.

Running ACERTMGR

The main file acertmgr.py is intended to be run regularly (e.g. as daily cron job / systemd timer) as root or a user with enough privileges.

Requirements
- Python (2.7+ and 3.5+ should work)
- cryptography>=0.6

Optional requirements (to use specified features)
- PyYAML: to parse YAML-formatted configuration files
- dnspython: used by dns.* challenge handlers
- idna: to allow automatic conversion of unicode domain names to their IDNA2008 counterparts
- cryptography>=2.1: for creating certificates with the OCSP must-staple flag (cert_must_staple)
- cryptography>=2.6: for usage of Ed25519/Ed448 keys

Setup

You should decide which challenge mode you want to use with acertmgr:
- webdir: In this mode, responses to challenges are put into a directory, to be served by an existing webserver.
- standalone: In this mode, challenges are completed by acertmgr directly. This starts a webserver to solve the challenges, which can be used standalone or together with an existing webserver that forwards requests to a specified local port/address.
- dns.*: This mode puts the challenge into a TXT record for the domain (usually _acme-challenge.) where it will be parsed from by the authority.
- dns.* (Alias mode): Can be used similar to the above but allows redirection of _acme-challenge. to any other (updatable) domain defined in dns_updatedomain via CNAME (e.g.
_acme-challenge.example.net IN CNAME bla.foo.bar with dns_updatedomain="bla.foo.bar" in domainconfig)dns.nsupdate: Updates the TXT record using RFC2136You can optionally provide the private key files to be used with the ACME protocol (if you do not they will be automatically created):The account private key is (by default) expected at/etc/acertmgr/account.key(used to register an account with the authorities server)The domain private keys are (by default) expected at/etc/acertmgr/{cert_id}.keyIf you are missing these keys, they will be created for you (using RSA with the configured key_length) or you can create them using e.g.openssl genrsa 4096 > /etc/acertmgr/account.keyDo not forget to set proper permissions of the keys usingchmod 0400 /etc/acertmgr/*.keyif you created those manuallyFinally, you need to setup the configuration files, as shown in the next section. While testing, you can use the acme-staging authority instead, in order to avoid issuing too many certificates.Authorities (e.g. our default Let's Encrypt) will require you to accept their Terms of Service. This can be done either in the optional global config file and/or via a commandline parameter (see acertmgr.py --help).ConfigurationConfiguration examples are included in thedocs/directory. All configuration files can use yaml (requires PyYAML) or json syntax. (Note: The JSON examples may be incomplete due to inability to express comments in JSON)Unless specified with a commandline parameter (see acertmgr.py --help) the optional global configuration is read from/etc/acertmgr/acertmgr.conf. Domains for which certificates should be obtained/renewed are be configured in/etc/acertmgr/*.conf(the global configuration is always excluded if it is in the same directory). 
By default the directory (work_dir) containing the working data (csr,certificate,key and ca files) is located at/etc/acertmgr/.4 configuration contexts are known (domainconfig (d) > globalconfig (g) > commandline (c) > built-in defaults) with the following directives (subject to change, usual usage context written bold):DirectiveContextDescriptionBuilt-in Default-c/--config-filecglobal configuration file (optional)/etc/acertmgr/acertmgr.conf-d/--config-dircdirectory containing domain configuration files (ending with .conf, globalconfig will be excluded automatically if in same directory)/etc/acertmgr/*.conf-w/--work-dircworking directory containing csr/certificates/keys/ca files/etc/acertmgr--force-renewc(or --renew-now) Immediately renew all certificates containing the given domain(s)--revokecRevoke the certificate at the given path--revoke-reasoncProvide a reason code for the revocation (seehttps://tools.ietf.org/html/rfc5280#section-5.3.1for valid values)domain (san-domain...):d(domainconfig section start) Domains to use in the cert request. This value will be MD5-hashed as cert_id.apid,gDetermines the API version usedv2authorityd,gURL to the certificate authorities ACME API root (without trailing /directory or similar)https://acme-v02.api.letsencrypt.orgauthority_tos_agreementd,g,cIndicates agreement to the ToS of the certificate authority (--authority-tos-agreement on command line)authority_contact_emaild,g(v2 API only) Contact e-mail to be registered with your account keyaccount_keyd,gPath to the account key{work_dir}/account.keyaccount_key_algorithmd,gKey-algorithm for newly generated account keys (RSA, EC, ED25519, ED448)RSAaccount_key_lengthd,gKey-length for newly generated RSA account keys (in bits) or EC curve (256=P-256, 384=P-384, 521=P-521)depends on account_key_algorithmttl_daysd,gRenew certificate if it has less than this value validity left30validate_ocspd,gRenew certificate if it's OCSP status is REVOKED. 
Allowed values for this key are: false, sha1, sha224, sha256, sha384, sha512. Default: sha1 (as mandated by RFC5019).

| Option | Scope | Description | Default |
|---|---|---|---|
| cert_dir | d,g | Directory containing all certificate related data (crt,key,csr) | {work_dir} |
| key_algorithm | d,g | Key-algorithm for newly generated private keys (RSA, ECC, ED25519, ED448) | RSA |
| key_length | d,g | Key-length for newly generated RSA private keys (in bits) or EC curve (256=P-256, 384=P-384, 521=P-521) | depends on key_algorithm |
| csr_static | d,g | Whether to re-use a static CSR or generate a new dynamic CSR | false |
| csr_file | d,g | Path to store (and load) the certificate CSR file | {cert_dir}/{cert_id}.csr |
| ca_static | d,g | Whether to re-use a static CA or download a CA file | false |
| ca_file | d,g | Path to store (and load) the certificate authority file | {cert_dir}/{cert_id}.ca |
| cert_file | d | Path to store (and load) the certificate file | {cert_dir}/{cert_id}.crt |
| cert_revoke_superseded | d,g | Revoke the previous certificate with reason "superseded" after successful deployment | false |
| cert_must_staple | d,g | Generate a certificate (request) with the OCSP must-staple flag (will be honoured on the next newly generated CSR if using csr_static=true) | false |
| key_file | d,g | Path to store (and load) the private key file | {cert_dir}/{cert_id}.key |
| mode | d,g | Mode of challenge handling used | standalone |
| webdir | d,g | [webdir] Put acme challenges into this path | /var/www/acme-challenge/ |
| http_verify | d,g | [webdir/standalone] Verify challenge before attempting authorization | true |
| bind_address | d,g | [standalone] Serve the challenge using a HTTP server on given IP | |
| port | d,g | [standalone] Serve the challenge using a HTTP server on this port | 80 |
| dns_ttl | d,g | [dns.*] Write TXT records with this TTL (also determines the update wait time at twice this value) | 60 |
| dns_updatedomain | d,g | [dns.*] Write the TXT records to this domain (you have to create the necessary CNAME on the real challenge domain manually) | |
| dns_verify_interval | d,g | [dns.*] Do verification checks when starting the challenge every {dns_verify_interval} seconds | 10 |
| dns_verify_failtime | d,g | [dns.*] Fail challenge TXT record verification after {dns_verify_failtime} seconds | {dns_waittime} + 1 |
| dns_verify_waittime | d,g | [dns.*] Assume DNS challenges are valid after {dns_verify_waittime} | 2 * {dns_ttl} |
| dns_verify_all_ns | d,g | [dns.*] Verify DNS challenges by querying all known zone NS servers (resolved by zone master from SOA or dns_verify_server) | false |
| dns_verify_server | d,g | [dns.*] Verify DNS challenges by querying this DNS server unless 'dns_verify_all_ns' is enabled, then use to determine zone NS | |
| nsupdate_server | d,g | [dns.nsupdate] DNS Server to delegate the update to | {determine from zone SOA} |
| nsupdate_verify | d,g | [dns.nsupdate] Verify TXT record on the update server upon creation | true |
| nsupdate_keyfile | d,g | [dns.nsupdate] Bind-formatted TSIG key file to use for updates (may be used instead of nsupdate_key*) | |
| nsupdate_keyname | d,g | [dns.nsupdate] TSIG key name to use for updates | |
| nsupdate_keyvalue | d,g | [dns.nsupdate] TSIG key value to use for updates | |
| nsupdate_keyalgorithm | d,g | [dns.nsupdate] TSIG key algorithm to use for updates | HMAC-MD5.SIG-ALG.REG.INT |
| defaults | g | Default deployment action settings used by all domains | |
| path | d | (deployment) deploy certificate data to the given file | |
| format | d,g (defaults) | (deployment) deploy one or more of the following data to the file at path: key,crt,ca | |
| user | d,g (defaults) | (deployment) change the user of the file deployed at path to this value (optional, defaults to acertmgr current effective user) | |
| group | d,g (defaults) | (deployment) change the group of the file deployed at path to this value (optional, defaults to acertmgr current effective group) | |
| perm | d,g (defaults) | (deployment) change the permissions of the file deployed at path to this value (optional, CAUTION: uses system defaults for new files) | |
| action | d,g (defaults) | (deployment) run the following action after deployment is finished. This command will be run in a shell and supports its syntax (optional) | |

## Security

Please keep the following in mind when using this software:

- DO read the source code, since it (usually) will be run as root
- Make sure that your configuration files are NOT writable by other users - arbitrary commands can be executed after updating certificates
- Try to run this program non-privileged if possible. This requires you to:
  - Create a dedicated user for acertmgr (e.g. acertmgr)
  - Run acertmgr as that user (add acertmgr to that user's cron!)
  - Give the created user read/write access to all configured files
  - Run any programs/scripts defined on cert update as the created user (might need work-arounds with sudo or wrapper scripts)
aces
UNKNOWN
aces-aces
Failed to fetch description. HTTP Status Code: 404
aces-apps
ACES Applications (aces-apps)

ASKAP Commissioning and Early Science python packages.

Provides a collection of tools developed by members of the ACES team to support ACES processing:

- aces.askapdata - tools for retrieving ASKAP observation metadata, including scheduling block info
- aces.beamset - beamset classes supporting beam analysis
- aces.cleanlog - tools for analysing imager deconvolution (clean) logs
- aces.data - antenna positions
- aces.display - tools for displaying data summaries
- aces.fov - tools for field-of-view modelling
- aces.holography - tools for processing holography data to measure beams: extract a holography beamset from an ASKAP measurement set
- aces.misc - tools for miscellaneous tasks
- aces.mpfit - perform Levenberg-Marquardt least-squares minimization. The original version of this software, called LMFIT, was written in FORTRAN as part of the MINPACK-1 package.
- aces.obsplan - tools for planning ASKAP observations, including footprint utilities
- aces.sefd - tools for estimating ASKAP sensitivity (SEFD from 1934 data)
- aces.survey - tools for processing ASKAP survey data

## Installation

Below are some instructions that attempt to outline the way to install for most situations. The primary dependencies and build instructions for pip are located in pyproject.toml. Currently, the installation dependency for the askap module is defined in askap-requirements.txt.

### 'I just want it deployed'

This is intended if you just want to install the tooling in a modular and simple fashion. NOTE: You'll need to install conda on your system first.

```shell
# If using ssh:
git clone ssh://[email protected]:7999/aces/aces-apps.git
# If using https:
git clone https://bitbucket.csiro.au/scm/aces/aces-apps.git
cd aces-apps

# Optional: Checkout a specific version of aces-apps.
# By default, this will be the latest (`master`) version.
git checkout {VERSION}

# Create a conda environment with optional name. If you don't specify
# a name (with -n {ENVIRONMENTNAME}), it will be called 'aces'
conda env create -n {ENVIRONMENTNAME}

# Activate the environment
conda activate {ENVIRONMENTNAME}
```

Please note that VERSION and ENVIRONMENTNAME are simply placeholders and need to be updated appropriately. In the above, the following occurs:

- The repository is cloned into a new directory (using ssh or https)
- A desired version is optionally checked out
- A conda environment is created installing the software

NOTE: If you have any error around mpi4py on a Pawsey system such as galaxy, see the section below that outlines the modules that need to be loaded.

### Slightly more advanced

This outlines a process that gives a little more granularity for bespoke processes and use cases. Installing aces-apps should be a fairly straightforward process. The current version (1.3.0) expects a python 3.9 installation. It is recommended that a conda virtual environment is created with this version:

```shell
conda create -n {ENVIRONMENTNAME} python=3.9 -y
```

This will create a new conda environment with the appropriate python version installed. Next, activate this environment with

```shell
conda activate {ENVIRONMENTNAME}
```

Before issuing the next set of pip commands, please first take note whether you will be installing this package on a system with specialised MPI tooling, for example Cray-style HPC. If so, refer to the next section.

Once the conda environment you previously created has been activated (and the mpi tooling requirements have been dealt with), installing aces-apps should be straightforward.

```shell
# If using ssh:
pip install git+ssh://[email protected]:7999/aces/[email protected]
# If using https:
pip install git+https://bitbucket.csiro.au/scm/aces/[email protected]
```

This command should clone the aces-apps repository to a temporary space, checkout the 1.3.0 tag, and use the pyproject.toml file to identify dependencies and their version specifications. Please note the @1.3.0 tag, which will install a specific version of the aces-apps tooling.

If an mpi4py compilation error is thrown, the required mpi libraries and environments are likely not correctly defined. Please refer to the next section if using Galaxy/Pawsey, or the documentation for your HPC system.

Finally, the askap python module needs to be installed (or rather, a subset of the complete module). This is being performed as a final step as its current layout is not well suited for automated installation with typical tooling. Issue these commands:

```shell
pip install -r requirements.txt
# - OR -
pip install git+https://bitbucket.csiro.au/scm/tos/python-askap.git#egg=askap
pip install git+https://bitbucket.csiro.au/scm/tos/python-parset.git#egg=askap.parset
pip install git+https://bitbucket.csiro.au/scm/tos/python-footprint.git#egg=askap.footprint
```

Specifically, the python-askap module is actually a small set of independent modules. The above set of pip commands will install each module into a common namespace as a submodule. The use of the egg=askap argument in the endpoint is important. If a specific version of python-askap is required, this will need to be set using the @{VERSION} style of notation.

### Installing as a developer

If you are trying to develop or test code as part of the aces-apps tooling, the above instructions are mostly correct, but some slight tweaks are needed. First, clone the repository and checkout the appropriate tag/branch.

```shell
# If using ssh:
git clone ssh://[email protected]:7999/aces/aces-apps.git
# If using https:
git clone https://bitbucket.csiro.au/scm/aces/aces-apps.git
cd aces-apps
git checkout {VERSION}
```

If you are planning on developing a new feature it is also recommended that a new branch is created to isolate changes from the main. Then create the conda environment and activate it:

```shell
conda create -n {DEV_ENVIRONMENTNAME} python=3.9
conda activate {DEV_ENVIRONMENTNAME}
```

Unlike the previous set of conda environment commands, this one will not use the environment.yml file to automagically install everything. Provided the above has worked, you can then do a developer install of aces-apps by

```shell
pip install -e .
```

The -e will essentially create a symlink from the site-packages/aces path to the current location. This will let you edit code from the current location and have it picked up when running scripts that import aces. Finally, configure the askap tooling:

```shell
pip install -r requirements.txt
# - OR -
pip install git+https://bitbucket.csiro.au/scm/tos/python-askap.git#egg=askap
pip install git+https://bitbucket.csiro.au/scm/tos/python-parset.git#egg=askap.parset
pip install git+https://bitbucket.csiro.au/scm/tos/python-footprint.git#egg=askap.footprint
```

### Notes on mpi4py

Some tooling in aces-apps repositories relies on mpi for code parallelism. On some Cray-type supercomputing systems this requires a specialised set of mpi libraries and compilers. The galaxy system is such a machine. This tooling needs to be loaded into the shell environment before issuing the pip command to install aces-apps. On Galaxy, issue these commands:

```shell
# Extra environment commands for galaxy
module swap PrgEnv-cray PrgEnv-gnu
export MPICC=/opt/cray/pe/craype/2.5.13/bin/cc
```

Similarly, on zeus the following commands are required:

```shell
# Extra environment commands for zeus
module load intel-mpi
export MPICC=/pawsey/intel/17.0.5/compilers_and_libraries/linux/mpi/intel64/bin/mpicc
```

These can be combined, and placed into your ~/.bashrc:

```shell
# MPI4PY
# Test if machine is galaxy
machine=$(hostname)
if [[ $machine == galaxy* ]]; then
    module swap PrgEnv-cray PrgEnv-gnu
    export MPICC=/opt/cray/pe/craype/2.5.13/bin/cc
fi
# Test if machine is zeus
if [[ $machine == zeus* ]]; then
    module load intel-mpi
    export MPICC=/pawsey/intel/17.0.5/compilers_and_libraries/linux/mpi/intel64/bin/mpicc
fi
```

Please note that these instructions may not be correct or relevant on other HPC machines, including others at Pawsey. Care will need to be taken if attempting to deploy the package on a Cray-style system. A sample script aces_test_mpi_install.py is installed alongside this package as a way of verifying an mpi installation and configuration should it be needed.

### Notes from mpfit

Taken verbatim from the mpfit documentation:

> This library contains a few useful routines I wrote or I converted from IDL. My contacts are: Sergey [email protected] of Astronomy, University of Cambridge, Madingley Road, CB3 0HA, Cambridge, UK. If you have found a bug or have a patch, you can send them to me. With my library I do not promise a stable interface, so beware. The licensing for the programs I wrote myself is GPL3. For all other programs (mainly converted from IDL) I guess the license is either BSD or they are in public domain. Here is the quick list of the functions I implemented: TBW
ace-sklearn-crfsuite
General Information

ace-sklearn-crfsuite is a fork of sklearn-crfsuite, adapted to fit modern versions of scikit-learn. This version went for the strict minimum, and only supports Python 3.10. License is MIT.

How to install

The package is available as a pip package.

```shell
pip install ace-sklearn-crfsuite
```

How to use

We provide a tutorial to demonstrate how the package supports the integration of recent sklearn improvements into the existing code base. This tutorial is heavily inspired by the original one available in the sklearn-crfsuite documentation.

How to contribute

The project uses pipenv to allow easy external contributions. Using the --dev option installs the development tools (tests, build and deploy):

```shell
pipenv install --dev
```

One can start an environment with all dependencies satisfied using the following command:

```shell
pipenv shell
```

Inside this environment, to run the tests:

```shell
python -m pytest tests/test_*.py
```

To check code coverage, one needs to first run the tests with coverage, and then ask for a report:

```shell
coverage run -m pytest tests/test_*.py
coverage report
```

To build the code as a deployable package:

```shell
python -m build
```

To upload the freshly built packages to PyPI:

```shell
twine upload -r testpypi dist/*
```

Remove -r testpypi if the deployment went well, to publish to the real PyPI repository.
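For context, CRFsuite-style sequence models consume one feature dictionary per token. The sketch below shows the general shape of such a feature extractor; the particular feature names (`word.lower`, `suffix3`, `BOS`, …) are illustrative choices, not anything mandated by ace-sklearn-crfsuite:

```python
def word2features(sentence, i):
    """Build a feature dict for the i-th token of a tokenized sentence.
    CRFsuite-style models take one such dict per token."""
    word = sentence[i]
    features = {
        'bias': 1.0,
        'word.lower': word.lower(),
        'word.istitle': word.istitle(),
        'word.isdigit': word.isdigit(),
        'suffix3': word[-3:],  # crude morphology signal
    }
    if i > 0:
        features['prev.lower'] = sentence[i - 1].lower()
    else:
        features['BOS'] = True  # beginning of sentence marker
    if i == len(sentence) - 1:
        features['EOS'] = True  # end of sentence marker
    return features


def sent2features(sentence):
    """One feature dict per token, the input format CRF taggers expect."""
    return [word2features(sentence, i) for i in range(len(sentence))]
```

A list of such lists (one per sentence) is what you would pass as `X` when fitting a CRF model.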
aces-metric
ACES

This is the repository of Audio Captioning Evaluation on Semantics of Sound (ACES). Here you will find instructions on how to train an ACES model and calculate statistics.

Installation

```shell
pip install aces-metric
```

Usage

The candidates can be a list; the references can be a list or a list of lists.

```python
from aces import get_aces_score

candidates = ["a bunch of birds are singing"]
references = ["birds are chirping and singing loudly in the forest"]
score = get_aces_score(candidates, references, average=True)
```

Semantics of sounds

To get an output of classes of semantic groups from a caption:

```python
from transformers import pipeline

pipe = pipeline("token-classification", "gijs/aces-roberta-13", aggregation_strategy="simple")
pipe("Bird chirps in the tree while a car hums")
```

Evaluation

All the code that is used to evaluate different models for the research paper can be found in the evaluation folder on the github. Particularly, the model evaluation can be found in evaluation/eval.py, and information about the FENSE experiment can be found in evaluation/fense_experiment/main.py.
aceso
# aceso

[![PyPI version](https://badge.fury.io/py/aceso.svg)](https://badge.fury.io/py/aceso) [![Build Status](https://travis-ci.org/tetraptych/aceso.svg?branch=master)](https://travis-ci.org/tetraptych/unrasterize) [![Coverage Status](https://coveralls.io/repos/github/tetraptych/aceso/badge.svg?branch=master)](https://coveralls.io/github/tetraptych/aceso?branch=master) [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)

Aceso is a lightweight package to calculate 2SFCA and other measures of spatial accessibility.

This package shares a name with [the Greek goddess of the healing process](https://en.wikipedia.org/wiki/Aceso). The two go hand-in-hand, since the most common applications of 2SFCA and gravity models of spatial interaction are for measuring spatial access to healthcare services.

## Installation

`aceso` is available on [PyPI](https://pypi.org/project/aceso/) and can be installed via the following command:

```
pip install aceso
```

The only dependency is `numpy`.

## Gallery

![Potential spatial access to abortion clinics in Texas (2017)](https://farm1.staticflickr.com/902/39848822400_197406d944_z_d.jpg "Gravity model using raised cosine decay with 3 hour driving radius")
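To make the two-step floating catchment area (2SFCA) idea concrete, here is a minimal pure-Python sketch of the method itself — an illustration, not aceso's API (aceso is numpy-based and its interface may differ):

```python
def two_step_fca(supply, demand, distance, radius):
    """Basic 2SFCA.

    supply:   {provider id: capacity}
    demand:   {location id: population}
    distance: distance[location][provider] travel cost
    radius:   catchment size (same units as distance)
    """
    # Step 1: each provider's supply-to-demand ratio over the
    # population that falls inside its catchment.
    ratios = {}
    for j, capacity in supply.items():
        pop = sum(p for i, p in demand.items() if distance[i][j] <= radius)
        ratios[j] = capacity / pop if pop else 0.0
    # Step 2: each location's accessibility is the sum of the ratios
    # of all providers it can reach within the catchment.
    return {
        i: sum(r for j, r in ratios.items() if distance[i][j] <= radius)
        for i in demand
    }
```

Gravity-model variants replace the hard `<= radius` cutoff with a smooth distance-decay weight (such as the raised cosine decay mentioned in the gallery caption).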
acesql
acesql_pypi

This is a project to allow users to have access to databases on the ACE platform.

Usage

```python
from acesql import writeSQL

## Parameters
# df: spark dataframe which contains the data
# url: jdbc url - i.e: jdbc:sqlserver://url:portnumber
# database: name of the database to write to - i.e.: ace-db
# dbtable: schema and the name of the table - i.e.: dbo.dbs
# username: username of the database
# password: password of the user
writeSQL(df, url, database, dbtable, username, password)
```

```python
from acesql import readSQL

# url: jdbc url - i.e: jdbc:sqlserver://url:portnumber
# database: name of the database to read from - i.e.: ace-db
# dbtable: schema and the name of the table - i.e.: dbo.dbs
# username: username of the database
# password: password of the user
readSQL(url, database, dbtable, username, password)
```

Developing

To install acesql along with the tools you need to develop and run tests, run the following in your virtualenv:

```shell
pip install -e .[dev]
```
acestock
No description available on PyPI.
acestream
Python AceStream

Python interface to interact with the AceStream Engine, HTTP API and Search API.

Installation

```shell
pip install acestream
```

Usage

```python
import time
import subprocess

from acestream.server import Server
from acestream.engine import Engine
from acestream.stream import Stream

# Create an engine instance
engine = Engine('acestreamengine', client_console=True)

# Connect to a remote server
server = Server(host='streams.com', port=6880)

# If the remote server is not available, connect to a local server
if not server.available:
  server = Server(host='127.0.0.1', port=6878)

# Start engine if the local server is not available
if not server.available:
  engine.start()

# Wait for engine to start
while not engine.running:
  time.sleep(1)

# Start a stream using an acestream channel ID
stream = Stream(server, id='ff36fce40a7d2042e327eaf9f215a1e9cb622b56')
stream.start()

# Open a media player to play the stream
player = subprocess.Popen(['mpv', stream.playback_url])

# Wait for player to close and stop the stream
player.communicate()
stream.stop()

# Stop acestream engine
engine.stop()
```

Search

```python
import time
import random
import subprocess

from acestream.server import Server
from acestream.engine import Engine
from acestream.search import Search

# Create an engine instance
engine = Engine('acestreamengine', client_console=True)

# Connect to a local server
server = Server(host='127.0.0.1', port=6878)

# Start engine if the local server is not available
if not server.available:
  engine.start(stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# Wait for engine to start
while not engine.running:
  time.sleep(1)

# Start a search for the sport category
search = Search(server, category='sport')
search.get(page=1)

# Iterate and print search results
for result in search.results:
  print("%40s %10s %40s" % (result.name, result.bitrate, result.infohash))

# Start a random stream from the search results
stream = random.choice(search.results).stream
stream.start(hls=True, transcode_audio=True)

# Open a media player to play the stream
player = subprocess.Popen(['mpv', stream.playback_url])

# Wait for player to close and stop the stream
player.communicate()
stream.stop()

# Stop acestream engine
engine.stop()
```

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/jonian/python-acestream.

License

Python AceStream is available as open source under the terms of the GPLv3.
acestream-launcher
Acestream Launcher

Acestream Launcher allows you to open Acestream links with a media player of your choice.

Dependencies

python, python-acestream, libnotify, acestream-engine

Since v1.0.0 acestream-launcher uses the Acestream Engine HTTP API that is available on acestream-engine v3.1 or later.

Usage

```
acestream-launcher URL [--player PLAYER] [--engine ENGINE]
```

Positional arguments:

```
URL                    The acestream url to play
```

Optional arguments:

```
-h, --help             Show this help message and exit
-p, --player PLAYER    The media player command to use (default: mpv)
-e, --engine ENGINE    The engine command to use (default: acestreamengine --client-console)
-t, --timeout TIMEOUT  Time in seconds to wait for stream playback (default: 30)
-l, --hls              Get HLS stream instead of HTTP stream
-v, --verbose          Show engine and media player output in console
```

Configuration

Create the ~/.config/acestream-launcher/config file to override the default arguments. Use the tty and browser sections to set different options when executing the script from the console or the web browser.

```
[DEFAULT]
player = vlc
verbose = true
timeout = 60
host = 127.0.0.1
port = 6878

[tty]
engine = acestreamengine --client-console --log-file /home/jonian/.ACEStream/engine.log

[browser]
engine = acestreamengine --client-gtk --log-file /home/jonian/.ACEStream/browser.log
verbose = false
```

Requirements

Install required dependencies (compatible with python 2 and 3):

```shell
sudo apt-get install python python-pip
```

Install optional dependencies (support for desktop notifications):

```shell
sudo apt-get install libnotify
```

Install Acestream engine manually (you can find actual links here and detailed instructions here):

```shell
sudo apt-get install python-setuptools python-m2crypto python-apsw
wget "http://download.acestream.media/linux/acestream_3.1.49_ubuntu_18.04_x86_64.tar.gz"
tar zxvf acestream_3.1.49_ubuntu_18.04_x86_64.tar.gz
sudo mv acestream_3.1.49_ubuntu_18.04_x86_64 /opt/acestream
sudo sed -i "/ROOT=/c\ROOT=\/opt\/acestream" /opt/acestream/start-engine
sudo ln -sf /opt/acestream/start-engine /usr/bin/acestreamengine
```

Install the Acestream engine Snap package:

```shell
sudo snap install acestreamplayer
```

Installation

Install the package from the Python Package Index using the pip command:

```shell
pip install acestream-launcher
```

Packages

Arch Linux: AUR Package. OpenSUSE: Build Service by @Drommer.

Browser integration

Once it is installed, you can set it as default for the acestream:// links in your browser. Check your browser preferences for default applications.
acestream-search
acestream-search

Produces an acestream m3u playlist, xml epg or json data. You need the acestream engine to get it working. If you have no acestream and don't want to install it, you can use https://github.com/vstavrinov/acestream-service instead of acestream-search.

Installation:

```shell
pip install acestream-search
```

Usage:

```shell
acestream_search
```

By default you will get a long m3u playlist. See help for more options:

```shell
acestream_search -h
```
aceto
Accelerated Charge and Energy Transfer Objects (ACETO) library

Contains Fortran (2003) routines for some common tasks in quantum theory of molecular charge and energy transfer.

The preferred pronunciation of the library's abbreviation is the Italian one. ACETO means vinegar in Italian. The library is designed to add some extra flavour to another project, the Python open quantum system theory package Quantarhei (see http://github.com/tmancal74/quantarhei).

HOW TO INSTALL ACETO

Aceto can be installed on Linux and Mac.

Installation from source:

One way of installing is by downloading the source code from github.com. You need to configure the Makefile by changing the content of the 'conf/conf.in' file to point to a file containing gcc flags (the gcc_linux.in and gcc_mac.in files are tested). Of course your system has to have the gcc compiler installed. Then you need to create a 'lib' directory in your home directory. This is a temporary fix, but right now the shared library 'libaceto.so' is "installed" locally this way. Then you need to issue the

> make
> make install

series of commands. You will be asked to confirm that the shared library that was created in the 'lib' subdirectory can be copied to the 'lib' directory in your home directory.

Binary installation:

The installation procedure of Aceto is still in development. Currently we provide binary distributions for macOS and Linux compiled with gcc compilers through a Python egg available from PyPI via the 'easy_install' command. Typing

> easy_install aceto

will install aceto, but in order for it to run correctly, you have to go to the directory where aceto is installed (this information is displayed during installation by 'easy_install') and type

> python postinstall.py

You will be asked to confirm that the 'libaceto.so' file can be moved to the directory 'lib' in your home directory. If this directory is not present, you have to create it.

Linux specific:

On Linux it seems that the LD_LIBRARY_PATH variable set in the qrhei script, which is used to run the quantarhei input files (as a subprocess), does not influence the setting for the subprocess. The solution is to export the LD_LIBRARY_PATH in something like .bashrc to point to the ${HOME}/lib directory.
acetolang
No description available on PyPI.
acetone
About

Glue code removal. Acetone is a python library that provides inversion of control in situations where other methods are inconvenient or not even possible. Or you just like the library.

Usage

Create the acetone container somewhere in your application:

```python
# dependencies.py
from acetone import AcetoneContainer

dependencies = AcetoneContainer()
# or ioc_container
# or lord_of_the_dependencies
# or services
```

Then use it:

```python
# class_with_dependency.py
from dependencies import dependencies

class ClassWithSomeDependency(object):
    # you can use strings or types as a key
    dependency = dependencies.Dependency('key')

    def use_the_dependency(self):
        self.dependency.dependency_call('argument')
```

Create a dependency implementation:

```python
# dependency_implementation.py
class DependencyImplementation(object):
    def dependency_call(self, argument):
        print(argument)
```

Later register the implementation and run it!

```python
# __main__.py
from dependencies import dependencies
from class_with_dependency import ClassWithSomeDependency
from dependency_implementation import DependencyImplementation

if __name__ == '__main__':
    dependency_implementation = DependencyImplementation()
    dependencies.register_instance('key', dependency_implementation)
    instance = ClassWithSomeDependency()
    instance.use_the_dependency()
```

Or load it from a file:

```json
[
    {
        "name": "key",
        "module": "dependency_implementation",
        "factory": "DependencyImplementation",
        "singleton": true
    }
]
```

```python
import json

from dependencies import dependencies

def main():
    with open('configuration.json') as file:
        content = json.load(file)
    dependencies.load_from_dicts(content)
    instance = ClassWithSomeDependency()
    instance.use_the_dependency()
```

Frequently asked questions

How fast is it?

It's very fast. It's even faster than a builtin property. The very first dependency access requires some initialization for its own setup and dependency creation (provided it was not created before), but subsequent calls are as fast as an instance member access. Dependencies use the descriptor protocol (used by @property); they are initialized lazily, and once fetched from the container they are set as a normal instance member (class member in the case of ClassDependency). This trick is used by several frameworks (for example werkzeug's cached_property).

How do I mock it?

Technically you can mock it, but I don't think it's necessary. The container is simple and well tested. Its purpose is to provide a requested dependency, and the dependency can be a mock as well. You can just consider it an essential part of your code and not mock it, to your advantage (would you mock properties?).

```python
class TestXyz(TestCase):
    def tearDown(self):
        container.clean()
```

Traditionalists wouldn't agree for sure, but Python wasn't created by traditionalists in the first place.

Are there any requirements?

No external dependencies. For the classes used, the only requirement is that the class has to be a normal python class with __dict__. In other words it can't use __slots__.
aceui
AceUI

AceUI is a UI automation testing framework based on Selenium.

Packaging is done with the poetry tool:

```shell
poetry build
poetry config repositories.testpypi https://pypi.org/project/aceui
poetry publish  # enter your PyPI username and password
```
ace-upload
ACE File Upload

Upload files to MinIO for use in brewlytics.

Setup procedures:

In a terminal/console:

```shell
pip install ace-upload
```

Once ace_upload is installed, run:

```shell
python -m ace_upload
```

Input procedures:

- Default location of connection: minio.dev.caoc.army
- Requires the user to provide an access key and password.
- The user will be prompted for their first-name initial and last name.
- The file path provided should be absolute. All files and subdirectory files will be picked up.
- The user provides a name for the intended cloud directory to upload to.
- Input Y to proceed with the upload or N to cancel the upload process.
aceye
ACEyeBusiness Ready Documents for Cisco ACICurrent API CoverageAccess Bundle GroupsAccess Control EntitiesAccess Control InstancesAccess Control RulesAccess Control ScopeAccess Policy Group Source RelationshipsAccess Port GroupsAccess Port ProfilesApplication ProfilesARP Adjacency EndpointsARP DatabaseARP DomainARP EntityARP InstancesARP InterfacesAttachable Access Entity ProfilesAttachable Access Entity Profiles Source RelationshipsBGP Address FamiliesBGP DomainsBGP EntitiesBGP InstancesBGP Instances PolicyBGP PeersBGP Peers AF EntriesBGP Peers EntriesBGP Route Reflector PolicyBGP Route ReflectorsBridge DomainsBridge Domains Target RelationshipsBridge Domains To OutsideCDP Adjacency EndpointsCDP EntitiesCDP InstancesCDP Interface AddressesCDP InterfacesCDP Management AddressesCluster Aggregate InterfacesCluster HealthCluster Physical InterfacesCluster RS Member InterfacesCompute ControllersCompute DomainsCompute Endpoint Policy DescriptionsCompute ProvidersCompute RS Domain PoliciesContext Source RelationshipsContexts (VRFs)Contexts Target RelationshipsContractsContract Consumer InterfacesContract ConsumersContract Consumers RootContract ProvidersContract Providers RootContract SubjectsContract Subjects Filter AttributesControllersDevice PackagesDomain AttachmentsDomain Profile Source RelationshipsEndpoint Profile ContainersEndpoints (All Connected Fabric Endpoints)Endpoints To PathsEPG to Bridge Domain LinksEPGs (Endpoint Groups)Equipment Board SlotsEquipment BoardsEquipment ChassisEquipment CPUsEquipment DIMMsEquipment Fabric ExtendersEquipment Fabric PortsEquipment Fan SlotsEquipment Fan TraysEquipment FansEquipment Field Programmable Gate ArraysEquipment Indicator LEDsEquipment Leaf PortsEquipment Line Card SlotsEquipment Line CardsEquipment Port Locator LEDsEquipment Power SuppliesEquipment Power Supply SlotsEquipment RS IO Port Physical ConfigsEquipment SensorsEquipment SP Common BlocksEquipment SPROM LCsEquipment SPROM Power SuppliesEquipment SPROM Power 
Supply BlocksEquipment SPROM SupervisorsEquipment StorageEquipment Supervisor SlotsEquipment SupervisorsEthernet Port Manager Physical Interfaces*EventsExternal Unmanaged NodesExternal Unmanaged Nodes InterfacesFabric Extended Path Endpoint ContainersFabric InstancesFabric Link ContainersFabric LinksFabric Loose LinksFabric Loose NodesFabric MembershipFabric Node SSL CertifcatesFabric NodesFabric Path Endpoint ContainersFabric Path EndpointsFabric PathsFabric PodsFabric Protected Path Endpoint ContainersFault SummaryFEX PoliciesFibre Channel EntityFirmware Card RunningFirmware Compute RunningFirmware RunningFunction PoliciesHealthHost Port SelectorsInterface PoliciesInterface ProfilesIP AddressesIPv4 AddressesIPv4 DomainsIPv4 EntitiesIPv4 InstancesIPv4 InterfacesIPv4 Next HopIPv4 RoutesISIS Adjacency EndpointsISIS Discovered Tunnel EndpointsISIS DomainsISIS Domains LevelISIS EntitiesISIS InstancesISIS InterfacesISIS Interfaces LevelISIS Next HopISIS RoutesL2 Bridge DomainsL2 EPG Bridge Domain Source RelationshipsL2 External Instance ProfilesL2 External InterfacesL2 External Logical Interface ProfilesL2 External Logical Node ProfilesL2 Interface Source RelationshipsL2Out PathsL2OutsL3 ContextsL3 Contexts Source RelationshipsL3 DomainsL3 Domains Source RelationshipsL3 InstancesL3 InterfacesL3 Logical Interface ProfilesL3 Logical Node ProfilesL3 Physical Interface Source RelationshipsL3 Routed InterfacesL3 Routed Loopback InterfacesL3 SubinterfacesL3 SubnetsL3Out IP AddressesL3Out MembersL3Out Node Source RelationshipsL3Out Path Source RelationshipsL3Out ProfilesL3OutsLACP EntitiesLACP InstancesLACP InterfacesLeaf Interface ProfilesLeaf Switch ProfilesLicense EntitlementsLLDP Adjacency EndpointsLLDP EntitiesLLDP InstancesLLDP InterfacesLocalesManagement InterfacesOSPF Adjacency EndpointsOSPF AreasOSPF DatabaseOSPF DomainsOSPF EntitiesOSPF External ProfilesOSPF InstancesOSPF InterfacesOSPF RoutesOSPF Unicast Next HopPath AttachmentsPhysical DomainsPhysical 
InterfacesPort BlocksPort Channel Aggregate InterfacesPort Channel Member InterfacesPrefix ListPrefix List DetailedQOS ClassesRoute PoliciesSecurity DomainsSpine Access Policy GroupsSpine Access Port ProfilesSpine Host Port SelectorsSpine Interface ProfilesSpine Switch ProfilesStatic Route Next Hop PoliciesSubnetsSVIsTenantTenant HealthTop SystemTunnel InterfacesUnicast Route DatabaseUnicast Route DomainsUnicast Route EntitiesUnicast Route Next HopUnicast RoutesUsersVLAN Encapsulation BlocksVLAN Endpoint Group EncapsulationVLAN Namespace PoliciesVLAN Namespace Source RelationshipsVLAN PoolsVMM Controller ProfilesVMM Domain ProfilesVMM Provider ProfilesVMM User ProfilesVPC ConfigurationsVPC DomainsVPC EntitiesVPC InstancesVPC InterfacesvzAnyvzAny To ConsumersvzAny To ProvidersvzDeny RulesvzEntriesvzFiltersvzInterface Source RelationshipsvzRule OwnervzTabooWired Nodes

Both Audit Log and Events are commented out of the base package due to the potentially huge number of records; should you want the Audit Log / Events, please uncomment lines 72-73 (Audit Log) and 76-77 (Events).

Installation

```shell
$ python3 -m venv ACI
$ source ACI/bin/activate
(ACI) $ pip install aceye
```

Usage - Help

```shell
(ACI) $ aceye --help
```

Usage - In-line

```shell
(ACI) $ aceye --url <url to APIC> --username <APIC username> --password <APIC password>
```

Usage - Interactive

```shell
(ACI) $ aceye
APIC URL: <URL to APIC>
APIC Username: <APIC Username>
APIC Password: <APIC Password>
```

Usage - Environment Variables

```shell
(ACI) $ export URL=<URL to APIC>
(ACI) $ export USERNAME=<APIC Username>
(ACI) $ export PASSWORD=<APIC Password>
```

Recommended VS Code Extensions

- Excel Viewer - CSV files
- Markdown Preview - Markdown files
- Markmap - Mindmap files
- Open in Default Browser - HTML files

Contact

Please contact John Capobianco if you need any assistance.
acf
Acf — a lightweight recommender engine for implicit feedback datasets. The package implements the algorithm described in the paper "Collaborative Filtering for Implicit Feedback Datasets". The algorithm is based on the following ideas: using collaborative filtering with latent factors; transforming feedback observations into binary preferences with associated confidence levels; using alternating least squares to compute the matrix factorization. Install: the package requires Python 3.7 or newer; the only dependencies are numpy and pandas. To install it, run `pip install acf`. Usage — the following example shows how to train a model and compute predictions:
```python
import acf
import pandas as pd

# assuming the data are in the following format:
# | user_id | item_id | feedback |
# |---------|---------|----------|
# | 2491    | 129     | 2        |
interactions = pd.read_csv('interactions.csv')

engine = acf.Engine(reg_lambda=1, alpha=35, n_factors=2, random_state=0)
engine.fit(interactions,
           user_column='user_id',
           item_column='item_id',
           feedback_column='feedback',
           n_iter=20, n_jobs=4)

# get the best 20 recommendations
prediction = engine.predict(user=2491, top_n=20)

# to print training loss value at every iteration
print(engine.loss)
```
Model evaluation: for performance evaluation, the package offers the `metrics.mean_rank` function, which implements the "mean rank" metric as defined by equation 8 in the paper. The metric is a weighted mean of percentile-ranked recommendations (`rank_ui = 0` says that item `i` is the first to be recommended for user `u`, and item `j` with `rank_uj = 1` is the last to be recommended), where the weights are the actual feedback values from the `R` user-item matrix.
```python
interactions_test = pd.read_csv('interactions_test.csv')

print(acf.metrics.mean_rank(interactions=interactions_test,
                            user_column='user_id',
                            item_column='item_id',
                            feedback_column='feedback',
                            engine=engine))
```
Model persistence: a trained model can be serialized and stored using joblib or pickle.
```python
# to store a model
with open('engine.joblib', 'wb') as f:
    joblib.dump(engine, f)

# to load a model
with open('engine.joblib', 'rb') as f:
    engine = joblib.load(f)
```
Public API: `acf.Engine` — `acf.core.computation.Engine(reg_lambda=0.1, alpha=40, n_factors=10, random_state=None)`: class exposing the recommender. `reg_lambda`: regularization strength; `alpha`: gain parameter in the feedback-confidence transformation `c_ui = 1 + alpha * r_ui`; `n_factors`: number of latent factors; `random_state`: initial RNG state. Properties: `user_factors` (user factor matrix), `item_factors` (item factor matrix), `loss` (training loss history). Methods: `Engine.fit(interactions, user_column, item_column, feedback_column, n_iter=20, n_jobs=1)` trains the model — `interactions`: dataframe containing user-item feedbacks; `user_column`: name of the column containing user ids; `item_column`: name of the column containing item ids; `feedback_column`: name of the column containing feedback values; `n_iter`: number of alternating least squares iterations; `n_jobs`: number of parallel jobs. `Engine.predict(user, top_n=None)` predicts the recommendation — `user`: user identification for whom the prediction is computed; `top_n`: if not `None`, only the best n items are included in the result. Returns: predicted recommendation score for each item as a `pandas.Series`. `acf.metrics.mean_rank` — `acf.core.metrics.mean_rank(interactions, user_column, item_column, feedback_column, engine)` computes the mean rank evaluation — `interactions`: dataframe containing user-item feedbacks; `user_column`: name of the column containing user ids; `item_column`: name of the column containing item ids; `feedback_column`: name of the column containing feedback values; `engine`: trained `acf.Engine` instance. Returns: computed value. Tests: tests can be executed with pytest as `python -m pytest acf/tests`.
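The preference/confidence transformation described above (`p_ui` derived from observed feedback, `c_ui = 1 + alpha * r_ui`) can be sketched in plain Python. This is only an illustration of the idea from the paper, not part of the acf API; the toy feedback matrix `R` and `alpha = 35` are made-up values:

```python
# Toy implicit-feedback matrix R (rows = users, columns = items);
# zero means "no observation", not "disliked".
R = [[2, 0, 5],
     [0, 1, 0]]

alpha = 35  # gain parameter, same role as acf.Engine's `alpha`

# Binary preference: p_ui = 1 wherever any feedback was observed.
P = [[1.0 if r > 0 else 0.0 for r in row] for row in R]

# Confidence: c_ui = 1 + alpha * r_ui, so unobserved cells keep confidence 1.
C = [[1.0 + alpha * r for r in row] for row in R]

print(P)  # [[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]]
print(C)  # [[71.0, 1.0, 176.0], [1.0, 36.0, 1.0]]
```

This is the transformation the alternating-least-squares fit then factorizes: it approximates `P` with user/item factors, weighting each cell's squared error by `C`.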
acfile
No description available on PyPI.
acfr
ACR – Angle Correction for Rotators. A simple script to load data from a rotator and correct for common errors in a rotator setup. Dependencies: python, numpy, matplotlib. Prerequisites: rotation should be performed at otherwise stable conditions; only one rotation per file; to correct for backlash, a forward and a backward rotation have to be performed; to correct for slip, ideally at least a 360° rotation has to be performed; to correct for an initial phase in a meaningful way, have a look at your sample when installing it. Usage — run:
```
python acr.py -i inputfile.dat -o outputfile.acr.dat
```
You will get a settings window (see the screenshot in the project README); then adjust your parameters in the GUI. Use this at your own risk and check the contents of the script beforehand.
acfun
No description available on PyPI.
acfun-get
No description available on PyPI.
acfunsdk
acfunsdk — UNOFFICIAL. acfunsdk is an unofficial Python library for the AcFun danmaku video site. Disclaimer: acfunsdk is a Python learning tool and does not crack any AcFun content. The code is fully public and intended for study and exchange only. For copyright or other related issues, please follow AcFun's agreements and the relevant laws and regulations. If you find a bug or have other questions, feel free to open an issue. Python: Python >= 3.8; please download and install it yourself. Install from PyPI:
```
python -m pip install acfunsdk
```
Usage — instantiate the main object and fetch content:
```python
from acfunsdk import Acer

# Instantiate an Acer
acer = Acer()

# Log in (a successful login is saved automatically as '<username>.cookies')
# Keep this file safe to prevent account theft
acer.login(username='[email protected]', password='balalabalala')

# Load a user (reads the '<username>.cookies' saved after a successful login)
acer.loading(username='13800138000')

# Daily sign-in to collect bananas 🍌
acer.signin()

# Get a content object directly from a URL.
# Nine content types are currently supported:

# Video: https://www.acfun.cn/v/ac4741185
demo_video = acer.get("https://www.acfun.cn/v/ac4741185")
print(demo_video)

# Article: https://www.acfun.cn/a/ac16695813
demo_article = acer.get("https://www.acfun.cn/a/ac16695813")
print(demo_article)

# Album: https://www.acfun.cn/a/aa6001205
demo_album = acer.get("https://www.acfun.cn/a/aa6001205")
print(demo_album)

# Bangumi: https://www.acfun.cn/bangumi/aa5023295
demo_bangumi = acer.get("https://www.acfun.cn/bangumi/aa5023295")
print(demo_bangumi)

# User page: https://www.acfun.cn/u/39088
demo_up = acer.get("https://www.acfun.cn/u/39088")
print(demo_up)

# Moment: https://www.acfun.cn/moment/am2797962
demo_moment = acer.get("https://www.acfun.cn/moment/am2797962")
print(demo_moment)

# Live: https://live.acfun.cn/live/378269
demo_live = acer.get("https://live.acfun.cn/live/378269")
print(demo_live)

# Share link: https://m.acfun.cn/v/?ac=37086357
demo_share = acer.get("https://m.acfun.cn/v/?ac=37086357")
print(demo_share)

# Doodle (single page): https://hd.acfun.cn/doodle/knNWmnco.html
demo_doodle = acer.get("https://hd.acfun.cn/doodle/knNWmnco.html")
print(demo_doodle)
```
Dependencies (listed in requirements.txt): httpx>=0.23, lxml>=4, beautifulsoup4>=4. References & thanks: AcFun Helper (a browser extension for AcFun, acfun.cn); AcFunDanmaku (an AcFun live danmaku tool written in C# and .NET 6); "Implementing your own AcFun live danmaku bot" by @財布士醬; the QQ channel "AcFun开源⑨课" (AcFun Open Source ⑨ Course); built with Poetry. About Me: ♂ have a big banana 🍌
acfunsdk-ws
acfunSDK — websocket. acfunsdk is an unofficial Python library for the AcFun danmaku video site. acfunsdk-ws is a companion component of acfunsdk that provides websocket communication support. Dependencies (listed in requirements.txt): acfunsdk>=0.9.7. WebSocket communication and data handling: websocket-client>=1.4, pycryptodome>=3.15, protobuf==3.20.1, proto-plus==1.22.1, filetype>=1.1. Bundled (with modifications): blackboxprotobuf. About Me: ♂ have a big banana 🍌
acfun-upload
AcFun upload tool — a Python-based command-line submission tool. Install:
```
pip install acfun_upload
```
Usage:
```python
from acfun_upload import AcFun

acfun = AcFun()
acfun.login(username="", password="")
acfun.create_douga(...)
```
create_douga parameters:
- file_path (required, str): path to the video file; an absolute path is recommended
- title (required, str): title of the submission
- channel_id (required, int): channel ID; see the channel ID list
- cover (required, str): path to the video cover image; an absolute path is recommended
- desc (optional, str): description of the submission
- tags (optional, list): tags for the submission
- creation_type (required, int): creation type, 1 = repost, 3 = original; defaults to 1
- originalLinkUrl (optional, str): source of the repost
License: GNU General Public License v3.0
acg
UNKNOWN
acgaws
No description available on PyPI.
ac-gb-distributions
No description available on PyPI.
acgc
ACGC — Overview: the acgc package is a collection of data analysis functions used by the Atmospheric Chemistry and Global Change Research Group (ACGC). Programs are written in Python 3. Installation — for conda users:
```
conda install -c conda-forge acgc
```
For pip users:
```
pip install acgc
```
For developers: if you plan to modify or improve the acgc package, an editable installation may be better:
```
pip install -e git+https://github.com/cdholmes/acgc-python
```
Your local files can then be managed with git, including keeping up to date with the GitHub source repository (e.g. `git pull`). Classic version: the old version of this package (before conversion to an importable Python module) is accessible as the "classic" branch of this repository on GitHub. Documentation: see https://cdholmes.github.io/acgc-python. Demos: the demo folder contains examples of how to accomplish common data analysis and visualization tasks, including using many of the functions within the acgc library.
acgen-by-den
No description available on PyPI.
acg-feature-extractor
Feature Extractor Lite — a feature extraction engine that unifies deep learning training and prediction. Quick start: how to build, install, and run. Testing: how to run the automated tests. Contributing: the patch contribution workflow and quality requirements. Version information: the versions and change history of this project can be viewed here. Maintainers: owners — dongdaxiang ([email protected]); committers — dongdaxiang ([email protected]). Discussion: Baidu Hi group (group number). Changelog — all notable changes to this project are recorded below; the format is based on Keep a Changelog, and versions follow Semantic Versioning and PEP-440. Unreleased — Added: newly added content is recorded here. Changed: changed content is recorded here. 0.1.0 (2020-11-27) — Added: project created.
acglib
acglib — a small Python package of reusable code bits for AWS, Azure, and GCP. Probably nothing you'll want to use.
acgv1signer
Note: BaiduCloud V1 auth; see https://cloud.baidu.com/doc/Reference/AuthenticationMechanism.html
ach
python-ach — ACH file generator module for Python. So far, this has been tested with "PPD" and "CCD" batches with addenda records. Example — below is an example of how to use the module:
```python
from ach.builder import AchFile

settings = {
    'immediate_dest': '123456789',  # Your bank's routing number
    'immediate_org': '123456789',   # Bank assigned routing number
    'immediate_dest_name': 'YOUR BANK',
    'immediate_org_name': 'YOUR COMPANY',
    'company_id': '1234567890',     # tax number
}

ach_file = AchFile('A', settings)  # file ID mod

entries = [
    {
        'type': '22',  # type of
        'routing_number': '12345678',
        'account_number': '11232132',
        'amount': '10.00',
        'name': 'Alice Wanderdust',
        'addenda': [
            {
                'payment_related_info': 'Here is some additional information',
            },
        ],
    },
    {
        'type': '27',
        'routing_number': '12345678',
        'account_number': '234234234',
        'amount': '150.00',
        'name': 'Billy Holiday',
    },
    {
        'type': '22',
        'routing_number': '12323231',
        'account_number': '123123123',
        'amount': '12.13',
        'name': 'Rachel Welch',
    },
]

ach_file.add_batch('PPD', entries, credits=True, debits=True)

print(ach_file.render_to_string())
```
This returns the following NACHA file:
```
101 123456789 1234567891407141745A094101YOUR BANK YOUR COMPANY
5220YOUR COMPANY 1234567890PPDPAYROLL 140714 1123456780000001
62212345678011232132 0000001000 ALICE WANDERDUST 1123456780000001
705HERE IS SOME ADDITIONAL INFORMATION 00000000001
622123456780234234234 0000015000 BILLY HOLIDAY 0123456780000002
622123232315123123123 0000001213 RACHEL WELCH 0123456780000003
822000000400370145870000000000000000000172131234567890 123456780000001
9000001000001000000040037014587000000000000000000017213
9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
```
(Note: NACHA records are fixed-width; column padding was lost in this rendering.)
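ACH entries identify banks by nine-digit ABA routing numbers, whose last digit is a checksum (digit weights 3, 7, 1 repeating; the weighted sum must be divisible by 10). This README doesn't show a validator in python-ach itself, so here is a hedged stdlib sketch of the standard check, handy for sanity-checking `routing_number` values before building a file; the helper name is my own:

```python
def is_valid_routing_number(rtn: str) -> bool:
    """Standard ABA checksum: weighted digit sum must be divisible by 10."""
    if len(rtn) != 9 or not rtn.isdigit():
        return False
    weights = (3, 7, 1, 3, 7, 1, 3, 7, 1)
    total = sum(int(d) * w for d, w in zip(rtn, weights))
    return total % 10 == 0

# "021000021" is a commonly cited valid ABA number; "123456789" fails the check.
print(is_valid_routing_number("021000021"))  # True
print(is_valid_routing_number("123456789"))  # False
print(is_valid_routing_number("12345678"))   # False (too short)
```

Note that the example entries above use 8-digit routing numbers, which a strict check like this would reject.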
achaekek
achaekek — Manifold Markets Python API
achakra
No description available on PyPI.
achallonge
achallonge — async Challonge for Python 3.5+. A modern library that is more than just a wrapper for the Challonge web API. Requirements: aiohttp. Optional: cchardet (a faster replacement for chardet, as mentioned on the aiohttp page), aiodns (for speeding up DNS resolving, highly recommended by aiohttp). Python version support: 3.5, 3.6, 3.7. Installation:
```
pip install achallonge
```
If you want to have the optional dependencies for aiohttp, you can:
```
pip install achallonge[speed]
```
Usage:
```python
import challonge

async def foo():
    # Log in into Challonge with your CHALLONGE! API credentials
    # (https://challonge.com/settings/developer).
    user = await challonge.get_user('your_challonge_username', 'your_api_key')

    # Retrieve your tournaments
    tournaments = await user.get_tournaments()

    # Tournaments, matches, and participants are all represented as Python classes
    for t in tournaments:
        print(t.id)      # 3272
        print(t.name)    # 'My Awesome Tournament'
        print(t.status)  # 'open'

    # Retrieve the participants for a given tournament.
    participants = await tournaments[0].get_participants()
    print(len(participants))  # 13
```
Documentation: the full documentation can be found on Read the Docs. Author / License: distributed under the MIT license; see LICENSE for details. Fabien Poupineau (fp12), 2017-2019. Twitter: @fp12gaming. Join the Discord server and discuss this lib!
achallonge-compat-fork
achallonge — async Challonge for Python 3.5+. A modern library that is more than just a wrapper for the Challonge web API. Requirements: aiohttp. Optional: cchardet (a faster replacement for chardet, as mentioned on the aiohttp page), aiodns (for speeding up DNS resolving, highly recommended by aiohttp). Python version support: 3.5, 3.6, 3.7. Installation:
```
pip install achallonge
```
If you want to have the optional dependencies for aiohttp, you can:
```
pip install achallonge[speed]
```
Usage:
```python
import challonge

async def foo():
    # Log in into Challonge with your CHALLONGE! API credentials
    # (https://challonge.com/settings/developer).
    user = await challonge.get_user('your_challonge_username', 'your_api_key')

    # Retrieve your tournaments
    tournaments = await user.get_tournaments()

    # Tournaments, matches, and participants are all represented as Python classes
    for t in tournaments:
        print(t.id)      # 3272
        print(t.name)    # 'My Awesome Tournament'
        print(t.status)  # 'open'

    # Retrieve the participants for a given tournament.
    participants = await tournaments[0].get_participants()
    print(len(participants))  # 13
```
Documentation: the full documentation can be found on Read the Docs. Author / License: distributed under the MIT license; see LICENSE for details. Fabien Poupineau (fp12), 2017-2019. Twitter: @fp12gaming. Join the Discord server and discuss this lib!
achan-test
No description available on PyPI.
achat
achat — achat is an asynchronous wrapper over the ChatSonic API for the Python programming language. Features: asynchronous support; message history support; convenient customization of all query parameters. Installation — to install the latest stable version, use:
```
pip install achat
```
Examples — using a chat instance:
```python
import asyncio

from achat import SonicChat

async def main():
    chat = SonicChat(token="your api token here")
    try:
        await chat.start()
        response = await chat.ask("Who is Elon Musk?")
        print(response.message)
        response = await chat.ask("When was he born?")
        print(response.message)
    finally:
        await chat.close()

asyncio.run(main())
```
Using an async context manager:
```python
import asyncio

from achat import SonicChat

async def main():
    async with SonicChat(token="your api token here") as chat:
        response = await chat.ask("Who is Elon Musk?")
        print(response.message)
        response = await chat.ask("When was he born?")
        print(response.message)

asyncio.run(main())
```
Documentation: coming soon. Dependencies: aiohttp.
achat_client_jul
No description available on PyPI.
achat_srv_jul
No description available on PyPI.
ach-calc
A package to perform arithmetic operations
acheck
A support tool for inspecting, validating, and correcting behavioral annotations. For more information, visit the project website: https://github.com/vac-mmis/CausalAnnotationCorrection. The tool requires Python 3.8 or higher. How to use it? Install the package using pip:
```
$ pip install acheck
```
After the installation, use:
```
$ acheck check $domain.pddl $problem.pddl $annotation.csv
```
By default, it will automatically start a local server on 127.0.0.1:9000. Options:
- `-l $file_1 $file_2 $file_n`: enter one or multiple annotation files, or the domain or problem file, to lock them in the editor. Example: `acheck check domain.pddl problem.pddl anno_1.csv -l domain.pddl anno_1.csv`
- `-o $directory`: enter a custom directory for all output files. Example: `acheck check domain.pddl problem.pddl anno_1.csv -o project/output`
- `-p port`: specify the port on which the server is running. Example: `acheck check domain.pddl problem.pddl anno_1.csv -p 8000`
- `-v`: enable verbose output.
- `-m directory`: enter a custom directory to load multiple annotations.
- `--nogui`: for command-line-only use.
- `--inplace`: work with the original files; for command-line-only use, without backup.
Enchant: the tool uses pyenchant for spell checking, a spellchecking library for Python based on the Enchant library. In order to work properly, you will need to install the Enchant C library manually. MacOS:
```
$ brew update
$ brew install enchant
```
To avoid problems, restart your system after installation. Linux: the quickest way is to install libenchant using the package manager of your current distribution:
```
$ pacman -S enchant
$ sudo apt-get install enchant-2
```
To detect the libenchant binaries, PyEnchant uses ctypes.util.find_library(), which requires ldconfig, gcc, objdump, or ld to be installed. This is the case on most major distributions; however, statically linked distributions (like Alpine Linux) might not bring along binutils by default. To avoid problems, restart your system after installation. If you experience any issues, you may want to have a look at https://pyenchant.github.io/pyenchant/install.html. It can happen that the Enchant installation does not include any providers that enable the spell check. In that case, you can install the desired provider yourself:
```
$ pacman -S aspell
$ sudo apt-get install aspell
```
The desired languages can be installed in the same way:
```
$ pacman -S aspell-en
$ pacman -S aspell-de
$ sudo apt-get install aspell-en
$ sudo apt-get install aspell-de
```
If you experience any issues, you may want to have a look at https://pyenchant.github.io/pyenchant/install.html#installing-a-dictionary. The standard language for the spell checker is English (en_US). You can configure the language with the following command:
```
$ acheck config -l $language
```
Supported languages (at the moment): en_US, en_GB, de_DE. Plan validation: if you want to enable plan validation, you need to download the KCL Validator. Follow the instructions on the GitHub page and download the binaries for your operating system. To get to the download, click on the Azure Pipelines button at the beginning of the README.md. Save the binaries in a location that suits you. After downloading, you need to configure the path to the validator executable file (bin/Validate or bin/Validate.exe). In order to use the plan validator correctly, stop the tool and set the path like:
```
$ acheck config -v $path-to-validate-executable
```
Examples:
```
$ acheck config -v /Users/macos64/Val-20211204.1-Darwin/bin/Validate
$ acheck config -v /Users/linux/Val-20211204.1-Linux/bin/Validate
$ acheck config -v /Users/windows/Val-20211204.1-Windows/bin/Validate.exe
```