package - stringlengths 1-122
package-description - stringlengths 0-1.3M
aamp-app
AAMP User Interface

An app to control the automated additive manufacturing platform.

Installation

To use this app, you need to install Python (version 3.10 or higher). In a terminal window in a new directory, run `pip install aamp_app` or `pip install --upgrade aamp_app` to install the app; use the latter to upgrade the app after it is installed. All dependencies are installed with it.

You may wish to use a virtual Python environment if you don't want this app or its dependencies to interfere with your current system. On Mac or Linux-based systems, do this by running the following commands before installing:

- `pip install virtualenv`: installs the virtualenv package, which allows you to create virtual Python environments.
- `virtualenv venv`: creates a virtual environment (called `venv`) in the current directory. This creates a new folder (called `venv`) with the environment data.
- `source venv/bin/activate`: activates the virtual environment. After activating it, you should see `(venv)` in your terminal window. If you close the terminal window/tab, you will have to run this command again to activate the environment before using the app.

Usage

To start the app, run `aamp_app`. Enter the database user credentials for MongoDB. These credentials are saved as plain text in a file (called `pw.txt`) in the same directory; therefore, this app should only be used on trusted computers. If a connection to the database cannot be established using the provided credentials, run `aamp_app` again to retry. To delete the user credentials, simply delete the `pw.txt` file.

Device Requirements

Most devices should be able to connect to the app without any problems. However, certain devices require some drivers/software to create a connection. Due to the nature of the specific drivers/software, they must be installed separately:

- Thorlabs devices (APT or Kinesis)

Miscellaneous Commands (for dev)

- Package app: `python3 setup.py sdist bdist_wheel`
- Upload to PyPI: `twine upload --skip-existing dist/*`
aamras
.. image:: https://github.com/allekmott/aamras/workflows/build/badge.svg
   :target: https://github.com/allekmott/aamras/workflows/build/badge.svg
   :alt: build status

.. image:: https://readthedocs.org/projects/aamras/badge/?version=latest
   :target: https://aamras.readthedocs.io/en/latest/?badge=latest
   :alt: documentation status

.. image:: https://img.shields.io/pypi/pyversions/aamras.svg
   :target: https://pypi.python.org/pypi/aamras
   :alt: supported python versions

*aam ras* or आम रस - Hindi for mango juice 🥭. It is pronounced "arm Russ", if the *r* in *arm* is silent.

aamras provides a high-level interface for headless browser manipulation. It is currently built on `selenium <https://github.com/SeleniumHQ/selenium>`_.

Requirements

aamras requires Python 3.6 or above.

Installation

The base aamras package may be installed using pip:

.. code-block::

   pip install aamras

Additional drivers (webdrivers) are also required for the actual browser manipulation side of things. The `webdrivermanager CLI tool <https://github.com/rasjani/webdrivermanager>`_ can streamline the installation process, but otherwise, there is always the core documentation for the drivers:

- For Firefox: `geckodriver <https://firefox-source-docs.mozilla.org/testing/geckodriver/>`_
- For Chrome: `chromedriver <https://sites.google.com/a/chromium.org/chromedriver/getting-started>`_
aamrd
Overview

Job scheduler abstraction for computation job submission and query.

License

GNU Affero General Public License
aamt
AAMT Project Template

Generates a pytest-based API automation test scaffold.

Python version: 3.9

- Install the latest version: `pip install aamt`
- Install a specific version: `pip install aamt==0.2.5`
- Upgrade aamt: `pip install -U aamt`
- Create a project scaffold: `aamt startproject demo`
- Create a project scaffold (and automatically create a virtual environment): `aamt startproject demo -venv`

If the external network is slow and pandas fails to install, a Chinese mirror is recommended:

`pip --default-timeout=6000 install -i https://pypi.tuna.tsinghua.edu.cn/simple aamt`
aanalytcsseg
Automation Tool for Team ACT
aanalytics2
Adobe Analytics API v2.0

This is a Python wrapper for the Adobe Analytics API 2.0.

Documentation

Most of the documentation for this API is hosted at datanalyst.info.

- Getting Started details on GitHub
- Appendix for running on a server

Versions

Documentation about the releases can be found here: aanalytics2 releases.

Functionalities

Functionalities that are covered:

Reporting API
- Run a report statement
- Retrieve Users
- Retrieve Segments
- Retrieve Metrics
- Retrieve Dimensions
- Retrieve Calculated Metrics
- Retrieve Virtual Report Suites
- Retrieve Virtual Report Suite Curated Components
- Retrieve Tags
- Retrieve Usage Logs from users
- Retrieve Projects
- Retrieve Scheduled Jobs / Projects
- Update Segment
- Update Calculated Metric
- Update Tags
- Update Project
- Delete Segment
- Delete CalculatedMetric
- Delete VirtualReportSuite
- Delete Project
- Delete DateRange
- Create a Project
- Create a scheduling job for a Workspace Project

Documentation on reporting: here.

Data Ingestion APIs
- Data Ingestion API from API 1.4
- Bulk Data Insertion API

Documentation on ingestion APIs: here.

Legacy Analytics API 1.4

This module provides limited support for the 1.4 API. It basically wraps your request with an internal module, and you can pass your request path, method, parameters and/or data. More information in the dedicated documentation for 1.4.

RequestCreator class

The aanalytics2 module enables you to generate the request dictionary for the getReport method easily. You no longer need to go to the UI to create a report template JSON; do it automatically from the Python interface. More information in the RequestCreator documentation.

Project Data

There is a feature to retrieve the Workspace projects and the components used. Refer to the documentation on Projects for more information.

Logging capability

In case you want to use the logging capability in your scripts, you can look at the reference on the logging documentation page.

Getting Started

To install the library with pip use:

pip install aanalytics2

or

python -m pip install --upgrade git+<https://github.com/pitchmuc/adobe_analytics_api_2.0.git#egg=aanalytics2>

Dependencies

In order to use this API in Python, you need the following libraries installed:
- pandas
- requests
- json
- PyJWT
- PyJWT[crypto]
- pathlib
- dicttoxml
- pytest

Test

Test support has been added with pytest. The complete documentation for running the tests can be found here: testing aanalytics2.

Other Sources

You can find information about the Adobe Analytics API 2.0 here:
- https://adobedocs.github.io/analytics-2.0-apis
- https://github.com/AdobeDocs/analytics-2.0-apis/blob/master/reporting-guide.md
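As a rough illustration of what the RequestCreator saves you from writing by hand, here is a hedged sketch of the kind of report request dictionary passed to getReport. The field names below are assumptions based on the public Adobe Analytics 2.0 reporting API, not taken from this README, and the report suite id is hypothetical.

```python
# Hedged sketch of a getReport request body; field names are assumptions
# from the Adobe Analytics 2.0 reporting API, values are hypothetical.
report_request = {
    "rsid": "my.report.suite",  # hypothetical report suite id
    "globalFilters": [
        {
            "type": "dateRange",
            "dateRange": "2023-01-01T00:00:00/2023-01-31T23:59:59",
        }
    ],
    "metricContainer": {
        "metrics": [{"columnId": "0", "id": "metrics/visits"}]
    },
    "dimension": "variables/page",
}

# Basic structural checks on the sketch.
assert report_request["dimension"].startswith("variables/")
assert report_request["metricContainer"]["metrics"][0]["id"] == "metrics/visits"
```

The RequestCreator class builds and mutates this kind of dictionary programmatically, so you never have to export the template JSON from the Workspace UI.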
aanalyticsact
Adobe Analytics for Team ACT v0.0.1

This module has been built to provide a better environment specifically for Adobe analysts in Team ACT. The module itself does not contain any confidential information, nor is it capable of retrieving any data unless the correct API credentials are given. The guide for pulling data is only provided internally; no inquiries from outsiders will be accepted.

Functionalities
- Retrieve daily, weekly and monthly data and automatically dump it into the DB connected on the local machine
- Capable of up to a 2nd breakdown
- Automatically retrieves data for all report suites
- Update segments
- Create segments

More functionalities will be added soon.
aanalyticsactauth
package for aanalyticsact
aanalyticssegg
This is the automation tool for team ACT
aanchal-task
No description available on PyPI.
aang
AANG

A PyPI module to set the user's background to AANG.PNG.

How does it work? Simple: just import aang.

import aang

Installation

On bash:

$ pip install aang

On Windows:

py -m pip install aang

Manually (NOT RECOMMENDED): run AANG.PY, found at aang.py.

License

This tool is open source under the MIT License terms.
aao
# Against all Odds

[![Build Status](https://travis-ci.com/S1M0N38/aao.svg?branch=master)](https://travis-ci.com/S1M0N38/aao) [![Codacy Badge](https://api.codacy.com/project/badge/Grade/840c1f6d7dba42e9b6fcbe8973008dcc)](https://www.codacy.com/app/S1M0N38/aao?utm_source=github.com&amp;utm_medium=referral&amp;utm_content=S1M0N38/aao&amp;utm_campaign=Badge_Grade)

The Against all odds project aims to offer an environment where everyone can test their betting strategy on real odds. You can get the odds in two ways: through manual scraping or using the api.

## Installation

To install againstallodds, open your terminal and type

` pip install aao `

If you plan to use manual scraping you must also install Chrome and chromedriver. Search on Google for how to do this on your operating system.

## Docs

Available at https://s1m0n38.github.io/aao/
aa-opcalendar
Operation Calendar

An operation calendar app for Alliance Auth to display fleet operations and other events.

Includes:
- Calendar-type view of different events
- Manual events: user created, detailed view
- Ical feed for exporting events
- Public NPSI events: automatic syncing with supported NPSI community events over APIs
- Ingame events: automatic syncing with ingame events
- Personal, corporation and alliance calendars
- Supports structure timers, aa-moonmining and aa-discordbot
- Event visibility options: custom names and colors, restrict to groups, restrict to states, webhook for sending event notifications, filter to include in ical feed
- Event categories: custom names, custom tickers, custom colors, pre-fill text to add on events with the category
- Multihost support
- Discord notifications: webhook for new, edited and deleted events
- Dark and white themes
- Custom event visibility filters and categories
- Details for manual events
- Supports importing public NPSI events right into opcalendar
- Pull ingame events from personal, corporation and alliance calendars
- Discord feed based on visibility filter
- Supports aa-discordbot to fetch events over Discord

Installation

1. Install the repo: pip install aa-opcalendar
2. Add 'opcalendar', to your INSTALLED_APPS in your project's local.py
3. Run migrations: python manage.py migrate
4. Collect static files: python manage.py collectstatic
5. Restart supervisor: supervisorctl reload myauth:
6. Set up permissions

Permissions

Perm | Auth Site | Example Target Group
opcalendar basic_access | Can access this app and see operations based on visibility rules | Everyone
opcalendar create_event | Can create and edit events | Leadership, FCs
opcalendar manage_event | Can delete and manage signups | Leadership, FCs
opcalendar see_signups | Can see all signups for an event | Leadership, FCs, Members

Settings

Name | Description | Default
OPCALENDAR_NOTIFY_IMPORTS | Whether to send out Discord notifications for ingame and public NPSI events | True
OPCALENDAR_DISPLAY_STRUCTURETIMERS | Whether to include timers from the structuretimers plugin in the calendar. Inherits view permissions from aa-structuretimers | True
OPCALENDAR_DISPLAY_MOONMINING | Whether to include extractions from the aa-moonmining plugin in the calendar. Inherits view permissions from aa-moonmining | True
OPCALENDAR_DISCORD_OPS_DISPLAY_EXTERNAL | Whether to display external hosts such as ingame hosts in the Discord ops command filters | False
OPCALENDAR_DISPLAY_MOONMINING_TAGS | Display the rarity tag of aa-moonmining moons if the moonmining plugin is installed | True
OPCALENDAR_DISPLAY_MOONMINING_ARRIVAL_TIME | Display aa-moonmining extraction time based on arrival time. Set to False to display the auto fracture time instead | True

Setup

Before you are able to create new events on the front end, you will need to set up the needed categories and visibility filters for your events.

1. Host

Hosts are for identification purposes. If you run a single corporation or alliance entity, you most likely only want one host. If you want to extend the calendar with other hosts, such as NPSI communities, you can create a host for each entity.
- The host name is shown on the event and on Discord notifications
- You can customize host logos
- Go to the admin site

2. Visibility filter

These filters determine who is able to see the events labeled with each visibility filter.
- Can be restricted to groups and states
- If no groups or states are selected, the events are visible to everyone
- You can set a custom color tag that is shown in the top right corner of the event
- Each visibility filter is displayed on the calendar and can be used for filtering events on the calendar
- Discord notification webhooks can be assigned to each visibility filter. Events created, deleted or edited under the filter are then sent to Discord.

3. Categories

Categories are displayed as a ticker in front of manually created events. The most common categories are PvP, Stratop, Mining, CTA, etc.
- Ticker displayed on the event
- Custom colors

4. Discord webhook

If you want to receive notifications about your events (created/modified/deleted) on Discord, add a webhook for the Discord channel that should receive the notifications. The webhooks you create are used in the visibility filters.

Adding manual events

To add a manual event, simply go to the calendar page and press the new event button. Fill in and select the needed information.

Importing NPSI fleets

Opcalendar can import predetermined NPSI fleets directly into your calendar from public NPSI community APIs.

Supported NPSI communities

Opcalendar currently supports imports for the following NPSI fleets: EVE LinkNet, Spectre Fleet, EVE University (classes), Fun Inc., FRIDAY YARRRR, Redemption Road, CAS, Fwaming Dwagons, FREE RANGE CHIKUNS

Setup
1. Go to the admin panel and select NPSI Event Imports
2. Create a host for each import and fill in the needed details
3. Add a new import by pressing the add event import button
4. Select the source from which you want to fetch the fleets
5. Determine the operation type for each fetched fleet
6. Determine the operation visibility for each fetched fleet

To schedule the import runs, either add the following line to your local.py file or set up a periodic task for the opcalendar.tasks.import_all_npsi_fleets task in your admin menu to fetch fleets every hour:

CELERYBEAT_SCHEDULE['import_all_npsi_fleets'] = { 'task': 'opcalendar.tasks.import_all_npsi_fleets', 'schedule': crontab(minute=0, hour='*'), }

Importing fleets from the ingame calendar

You can import events that have been created in the ingame calendar. As the fields on the ingame calendar are limited, these events will not be as detailed as events created directly in the calendar.

1. Give the add_ingame_calendar_owner role to the wanted groups
2. Navigate to the opcalendar page and press the Add Ingame Calendar Feed button
3. Log in with the character that holds the calendar
4. Add the following line to your local.py settings file, or set up a periodic task for opcalendar.tasks.update_all_ingame_events, to pull fleets from ingame every 5 minutes:

CELERYBEAT_SCHEDULE['update_all_ingame_events'] = { 'task': 'opcalendar.tasks.update_all_ingame_events', 'schedule': crontab(minute='*/5'), }

Ingame event visibility and categories

By default the imported ingame events have no visibility filter and no category, which means they will be visible to everyone. If you wish to add a visibility filter or a category, as with manual events, go to admin panel -> Ingame event owners and select a filter and a category for the owner. After selecting a visibility filter and a category, the ingame events behave like manual events and respect the group and state restrictions set in the visibility filters.

Ical feed setup (optional)

Opcalendar can generate a standard ical-formatted feed for pushing out events. Pushing events to the feed without a login requirement requires editing the Alliance Auth url file, as by default all pages are locked behind a login requirement.

Feed setup

1. Open the event visibility category and check the box to include it in the ical feed.

2. Open the Alliance Auth urls.py file, located by default at /home/allianceserver/myauth/myauth/urls.py. By default the file looks something like this:

from django.conf.urls import include, url
from allianceauth import urls

urlpatterns = [
    url(r'', include(urls)),
]

handler500 = 'allianceauth.views.Generic500Redirect'
handler404 = 'allianceauth.views.Generic404Redirect'
handler403 = 'allianceauth.views.Generic403Redirect'
handler400 = 'allianceauth.views.Generic400Redirect'

3. Include the following two lines in the file: from opcalendar.views import EventFeed #Added import for opcalendar ical feed below the imports at the top of the file, and url(r'^opcalendar/feed.ics', EventFeed()), #Opcalendar feed url in the urlpatterns before the url(r'', include(urls)), line. Your urls.py file should now look like this:

from django.conf.urls import include, url
from allianceauth import urls
from opcalendar.views import EventFeed #Added import for opcalendar ical feed

urlpatterns = [
    url(r'^opcalendar/feed.ics', EventFeed()), #Opcalendar feed url
    url(r'', include(urls)),
]

handler500 = 'allianceauth.views.Generic500Redirect'
handler404 = 'allianceauth.views.Generic404Redirect'
handler403 = 'allianceauth.views.Generic403Redirect'
handler400 = 'allianceauth.views.Generic400Redirect'

You can now access the ical feed at auth.example.com/opcalendar/feed.ics

Contributing

Make sure you have signed the License Agreement by logging in at https://developers.eveonline.com before submitting any pull requests. Bug fixes and features must not include superfluous formatting changes.
aaopto-aotf
AA OptoElectronics MPDSnCxx AOTF Driver

Python driver to control MPDSnCxx AOTF devices.

Installation

To install this package from PyPI, invoke: pip install aaopto-aotf

To install this package from GitHub in editable mode, from this directory invoke: pip install -e .

To install this package in editable mode with dependencies for building the docs, invoke: pip install -e .[dev]

Intro and Basic Usage

from aaopto_aotf.aotf import MPDS

aotf = MPDS("COM3")

Before writing values, you must first set the global blanking mode, and each channel's frequency, mode, and whether it is driven by external input or internal (software-controlled) input.

from aaopto_aotf.aotf import MPDS, MAX_POWER_DBM
from aaopto_aotf.device_codes import DriverMode, BlankingMode, VoltageRange

aotf.set_blanking(BlankingMode.INTERNAL)  # disable blanking control from external input pin.
aotf.set_external_input_voltage_range(VoltageRange.ZERO_TO_FIVE_VOLTS)

# Note: device channels are 1-indexed to be consistent with the datasheet.
for channel in range(1, aotf.num_channels + 1):
    aotf.set_frequency(channel, 110.5)
    aotf.set_driver_mode(DriverMode.EXTERNAL)

If the driver mode is set to DriverMode.EXTERNAL, the device will take its output setpoint from the external input pin. If set to DriverMode.INTERNAL, you can control the output with software settings:

for channel in range(1, aotf.num_channels + 1):
    aotf.set_driver_mode(DriverMode.INTERNAL)
    aotf.set_power_dbm(channel, MAX_POWER_DBM)
    aotf.enable_channel(channel)

Note that internal mode only enables a simple "on/off" control scheme; it does not support linear scaling the way external mode does via the external analog input.

At this point, you might want to save the values set above to the current profile:

aotf.save_profile()  # Now, calling aotf.reset() will start with the saved settings.

What's missing?

Minor dangling features that are not implemented:
- changing laser channel profiles at runtime (these must be changed with the external input pins)
- automatic sweeping mode (automatic self-sweeping is a somewhat out-of-the-ordinary feature for most users)

Examples

Have a look at the examples folder to see other examples, including a useful calibration script.
aa-package-monitor
Package Monitor

An app for keeping track of installed packages and outstanding updates with Alliance Auth.

Contents
- Overview
- Screenshots
- Installation
- Updating
- User Guide
- Settings
- Permissions
- Management Commands
- Change Log

Overview

Package Monitor is an app for Alliance Auth that helps you keep your installation up to date. It shows you all installed distribution packages and automatically notifies you when updates are available.

Features:
- Shows a list of installed distribution packages with related Django apps (if any)
- Identifies new valid releases for installed packages on PyPI
- Notifies the user which installed packages are outdated and should be updated
- Shows the number of outdated packages as a badge in the sidebar
- Takes into account the requirements of all installed packages and the current Python version when recommending updates
- Option to add distribution packages to the monitor which are not related to Django apps
- Option to show all known distribution packages (as opposed to only the ones that belong to installed Django apps)
- Copy the respective command for a package update to your clipboard directly from the package list
- Can automatically notify admins when there is an update available for a currently installed package
- Supported languages: English :us:, German :de: and Russian :ru:

Hint: Update notifications are sent as AA notifications to all admins. We recommend using Discord Notify to automatically forward those notifications to Discord as DMs.

Screenshots

Installation

Step 1 - Check preconditions

Please make sure you meet all preconditions before proceeding: Package Monitor is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding (see the official AA installation guide for details).

Step 2 - Install app

Make sure you are in the virtual environment (venv) of your Alliance Auth installation. Then install the newest release from PyPI:

pip install aa-package-monitor

Step 3 - Configure settings

Add 'package_monitor' to INSTALLED_APPS. Then add the following lines to your local.py to enable regular checking for updates:

CELERYBEAT_SCHEDULE['package_monitor_update_distributions'] = {
    'task': 'package_monitor.tasks.update_distributions',
    'schedule': crontab(minute='*/60'),
}

Step 4 - Finalize installation

Run migrations and copy static files:

python manage.py migrate
python manage.py collectstatic

Restart your supervisor services for Auth.

Step 5 - Initial data load

Last but not least, perform an initial data load of all distribution packages by running the following command:

python manage.py package_monitor_refresh

Updating

pip install -U aa-package-monitor
python manage.py collectstatic
python manage.py migrate

Finally, restart your AA supervisor services.

User Guide

This section explains how to use the app.

Terminology

To avoid any confusion, here are our definitions of some important terms:
- App: A Django application. An app is always part of a distribution package.
- Distribution package: A Python package that can be installed via pip or setuptools. Distribution packages can contain several apps.
- Requirement: A condition that distribution packages can define to specify dependencies on environments or on other distribution packages with specific versions. For example, the distribution package django-eveuniverse can have the requirement "django-esi>=2.0", which means it requires the package django-esi in at least version 2.0.

Operation modes

You can run Package Monitor in one of two modes:
1. Keep everything updated
2. Keep apps and selected distribution packages updated

Keep everything updated

In this mode Package Monitor monitors all installed distribution packages, and you will be informed about updates to any of them. This is the default operation mode.

Keep apps and selected distribution packages updated

In this mode Package Monitor monitors only those distribution packages that contain actually installed Django apps, and you will be informed when there is an update to any of your apps. Note that in this mode other installed distribution packages will not be shown. To activate this mode, set PACKAGE_MONITOR_SHOW_ALL_PACKAGES to False in your local settings. You can also add additional distributions to be monitored; for example, you might want to add celery. See also Settings for an overview of all settings.

Latest version

Package Monitor automatically determines the latest version of a distribution package from PyPI. Note that this can differ from the latest version shown on PyPI, because of additional considerations: First, Package Monitor takes into account all requirements of all installed distribution packages. For example, if Alliance Auth has the requirement "Django<3", then only Django 2.x will be shown as latest, since Django 3.x would not fulfill the requirement set by Alliance Auth. Second, Package Monitor in general ignores pre-releases and considers stable releases for updates only. The only exception is when the currently installed package is itself a pre-release; for example, if you have Black installed as a beta release, the app will also suggest newer beta releases.

Settings

Here is a list of available settings for this app. They can be configured by adding them to your AA settings file (local.py). All settings are optional, and the app will use the documented defaults if they are not set.

Name | Description | Default
PACKAGE_MONITOR_CUSTOM_REQUIREMENTS | List of custom requirements that all potential updates are checked against. Example: ["Sphinx<6"] | []
PACKAGE_MONITOR_EXCLUDE_PACKAGES | Names of distribution packages to exclude from monitoring. | []
PACKAGE_MONITOR_INCLUDE_PACKAGES | Names of additional distribution packages to be monitored, e.g. ["celery", "redis"]. This setting only makes sense when you are not monitoring all packages already. | []
PACKAGE_MONITOR_NOTIFICATIONS_ENABLED | Whether to notify when an update is available for a currently installed distribution package. Notifications are sent as AA notifications to all admins. | False
PACKAGE_MONITOR_SHOW_ALL_PACKAGES | Whether to monitor all distribution packages, as opposed to only monitoring packages that contain Django apps. | True
PACKAGE_MONITOR_SHOW_EDITABLE_PACKAGES | Whether to show distribution packages installed as editable. Since version information about editable packages is often outdated, this type of package is not shown by default. | False

Permissions

This is an overview of all permissions used by this app. Note that all permissions are in the "general" section.

Name | Purpose | Code
Can access this app and view | User can access the app and also request updates to the list of distribution packages | general.basic_access

Management Commands

The following management commands are included in this app:

Command | Description
package_monitor_refresh | Refreshes all data about distribution packages. This command does functionally the same as the hourly update and is helpful after you have finished updating outdated packages, to quickly see the result of your actions on the website.
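Putting the scheduled task and the documented settings together, a local.py fragment for the second operation mode might look like this. This is a sketch only: the CELERYBEAT_SCHEDULE entry and setting names are taken from the text above, while the concrete values chosen (monitoring celery and redis, enabling notifications) are illustrative.

```python
# Example local.py fragment for Package Monitor (a sketch based on the
# settings documented above; concrete values are illustrative).
from celery.schedules import crontab  # provided by Celery in an AA install

# Regular update check, as documented in Step 3 of the installation.
CELERYBEAT_SCHEDULE['package_monitor_update_distributions'] = {
    'task': 'package_monitor.tasks.update_distributions',
    'schedule': crontab(minute='*/60'),
}

# Operation mode: monitor apps plus selected packages only.
PACKAGE_MONITOR_SHOW_ALL_PACKAGES = False
PACKAGE_MONITOR_INCLUDE_PACKAGES = ["celery", "redis"]

# Send an AA notification to all admins when an update is available.
PACKAGE_MONITOR_NOTIFICATIONS_ENABLED = True
```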
aap-client-python
Overview

This library can be used to interface with the AAP, although it is also able to sign tokens (for testing that verification is done correctly). The cryptographic files here shouldn't be used in production; they're just for testing :)

Usage

To install the package, enable the virtual environment where it's going to be used and run:

$ pip install aap-client-python

To use the Flask functionality, this needs to be installed:

$ pip install aap-client-python[flask]

Developing

To prepare the environment for developing the library, create a virtual environment, go to the project root and then run:

$ pip install -e .[dev]

Testing

The recommended way is to test using detox, which allows testing in all the supported Python versions using virtual environments effortlessly. To use it, install it, then run in the project root:

$ pip install detox
$ detox

Alternatively, testing can be done in the same environment as the dev one by installing its dependencies, then running pytest:

$ pip install -e .[test]
$ python -m pytest -s
aapg
AAPG
aapigtf
Accelerated API Generic Test Framework

Pros
- Automation framework designed to test APIs; it suits all types of API tests
- API functionality can be tested
- Extra functions/methods can be incorporated to support API testing

Cons
- Cookie headers must be fetched manually and copied into 'Cookie_Header' in the config.ini file
aapippackage
No description available on PyPI.
aapns
AAPNS

Asynchronous Apple Push Notification Service client.

- Requires TLS 1.2 or better
- Requires Python 3.8 or better

Quickstart

from aapns.api import Server
from aapns.config import Priority
from aapns.models import Notification, Alert, Localized

async def send_hello_world():
    client = await Server.production('/path/to/push/cert.pem').create_client()
    apns_id = await client.send_notification(
        'my-device-token',
        Notification(
            alert=Alert(
                body=Localized(key='Hello World!', args=['foo', 'bar']),
            ),
            badge=42),
        priority=Priority.immediately)
    print(f'Sent push notification with ID {apns_id}')
    await client.close()
aa-policy-validator
:white_check_mark: Access Analyzer - Batch Policy Validator

This script will analyze all of your account's customer-managed IAM policies using AWS Access Analyzer - Policy Validation.

Usage

ProTip :bulb:: Use AWS CloudShell to run this directly on your AWS account.

Install

$ python3 -m pip install aa-policy-validator --user

Run

$ python3 -m aa-policy-validator

Update

$ python3 -m pip install aa-policy-validator -U --user --no-cache-dir

Results

Results will be written into the /tmp/findings folder with a README.md file inside.
aapp2face
🏛️ AAPP2FACe

AAPP2FACe is a Python library for interacting with the web services of FACe, the General Entry Point for Invoices of the Spanish General State Administration, from the side of the Spanish Public Administrations.

It is designed to be easy to use by developers, and it provides a command-line interface (CLI) that also makes it usable by end users.

Documentation: https://antmartinez68.github.io/aapp2face
Source code: https://github.com/antmartinez68/aapp2face

Requirements

- Python v3.10

Installation

As a library: although it depends on how you manage your project's dependencies, in general you will want to do:

$ pip install aapp2face

As a command-line application (CLI): if you only intend to use the command-line interface, it is recommended to install AAPP2FACe using pipx:

$ pipx install aapp2face

Basic usage

As a library: the following example script shows how you can create the objects needed to connect to FACe and retrieve information about the new invoices available for download:

>>> from aapp2face import FACeConnection, FACeSoapClient
>>> cliente = FACeSoapClient(
...     "https://se-face-webservice.redsara.es/facturasrcf2?wsdl",
...     "cert.pem",
...     "key.pem"
... )
>>> face = FACeConnection(cliente)
>>> nuevas_facturas = face.solicitar_nuevas_facturas()
>>> for factura in nuevas_facturas:
...     print(
...         factura.numero_registro,
...         factura.fecha_hora_registro,
...         factura.oficina_contable,
...         factura.organo_gestor,
...         factura.unidad_tramitadora,
...     )
...

As a command-line application (CLI): the same operation can be done using the CLI. Once the application is configured, just run the following command:

$ aapp2face facturas nuevas

Número registro: 202001015624
Fecha registro: 2023-01-19 10:57:38
Oficina contable: P00000010
Órgano gestor: P00000010
Unidad tramitadora: P00000010

Número registro: 202001017112
Fecha registro: 2013-01-20 11:05:51
Oficina contable: P00000010
Órgano gestor: P00000010
Unidad tramitadora: P00000010

2 nuevas facturas disponibles

Building AAPP2FACe from source

AAPP2FACe uses Poetry as its dependency manager and packaging tool. If you want to build it from source, you can do so with:

$ git clone https://github.com/antmartinez68/aapp2face
$ cd aapp2face
$ poetry install
$ poetry run pytest
$ poetry build

Note: the initial version of this project is part of the final degree project of Antonio Martínez's Computer Engineering degree at UNIR.
aapp-runner
[![Build status](https://github.com/pytroll/pytroll-aapp-runner/workflows/CI/badge.svg?branch=main)](https://github.com/pytroll/pytroll-aapp-runner/workflows/CI/badge.svg?branch=main) [![Coverage Status](https://coveralls.io/repos/github/pytroll/pytroll-aapp-runner/badge.svg?branch=main)](https://coveralls.io/github/pytroll/pytroll-aapp-runner?branch=main)A pytroll runner supporting real-time processing of Direct Readout or regional (RARS type) AVHRR and ATOVS data from level-0 to level-1 using the NWPSAF/AAPP package.
aa-probability
No description available on PyPI.
aaps
# AAPS-LAB Tools

A package with all the tools for the AAPS-LAB application.

## Getting started

These instructions will get you a working copy of the project on your local machine.

### Prerequisites

To install and use this package you need a Python installation. We recommend using the custom installer available through the AAPS-API system, which installs Python, conda, the aaps package and all of its dependencies.

This package makes use of the geospatial libraries [GEOS](https://trac.osgeo.org/geos/) (spatial objects and functions), [GDAL](https://www.gdal.org/) (spatial formats) and [PROJ4](https://proj4.org/) (coordinate transformation), so these must already be installed in order to install this package from source. If conda is used, the dependencies are installed automatically.

### Installation

If you do not use the custom installer, this package is also available on PyPI and Anaconda Cloud.

To install using the conda package manager (recommended), you can use the command

` conda install -c sergio.chumacero aaps `

To install using the pip package manager, you can use the command

` pip install aaps `

You can verify the installation and the version using `conda list` (conda) or `pip freeze` (pip).

## Authors

Sergio Chumacero - initial work - [GitHub](https://github.com/sergio-chumacero) - contact: [email protected]

## License

This project is under the MIT license, which is an "open source" license. See [LICENSE.md](LICENSE.md) for details.
aapt
# Aapt

Android Asset Packaging Tool for Python3

## Example

```python
import io

import aapt
import requests  # needed for the upload example below
from PIL import Image  # needed for the icon example below

help_text = aapt.aapt('--help')
print(help_text)

ls = aapt.ls('./xx.apk')
print(ls)

apk_info = aapt.get_apk_info('./xxx.apk')
print(apk_info)

# save icon
apk_info = aapt.get_apk_and_icon('./xxx.apk')
byte_stream = io.BytesIO(apk_info['icon_byte_value'])
img = Image.open(byte_stream)
img.save('./1.png')

# upload file ('url' is your upload endpoint)
requests.post(url, files={'file': apk_info['icon_byte_value']})
```

## API

- aapt(args)
- ls(file_path)
- dump(file_path, values)
- packagecmd(file_path, command)
- remove(file_path, files)
- add(file_path, files)
- crunch(resource, output_folder)
- single_crunch(input_file, output_file)
- version()
- get_apk_info(file_path)
aapt2
No description available on PyPI.
aa-pubsub
## About Pubsub

pubsub was originally written to forward monocular-camera data through a redis server, so that both the deep-learning side and the ROS side could subscribe to it at the same time. In actual use it was also extended to forward FairMOT detection results, with pubsub deployed across a master and a slave machine.

- Version 0.1.4 added master/slave and multi-topic support and changed the way the configuration file is set up.
- Version 0.1.5 polished setup.py and published the package to PyPI.
- Version 0.1.6 added a configs module through which configuration information can be read directly.
- Version 0.1.7 fixed the subscription "cross-talk" problem introduced by global variables and the singleton.

### 1. Problems

#### 1.1. redis.exceptions.ConnectionError: Connection closed by server.

See https://github.com/andymccurdy/redis-py/issues/1140. The problem was worked around with `try ... except redis.exceptions.ConnectionError:`.

FYI:
- In this project, the problem did not occur with `py37`.
- In this project, the problem did not occur with `python`.
- It did occur under `FairMOT`; the initial suspicion was a `timeout` problem.

The problem was fixed by installing hiredis (fix released in 0.1.5); it has not reappeared since.

##### 1.1.1. Cause 2: scheduled to be closed ASAP for overcoming of output buffer limits

Looking at /var/log/redis/redis-server.log, you can find:

```
3105:M 23 Nov 18:06:12.428 # Client id=105 addr=<sub-ip>:33750 fd=7 name= age=2 idle=0 flags=N db=3 sub=1 psub=0 multi=-1 qbuf=0 qbuf-free=0 obl=0 oll=38 omem=35102272 events=rw cmd=subscribe scheduled to be closed ASAP for overcoming of output buffer limits.
```

Whenever a client requests data from redis, the reply is first stored in an output buffer, which is cleared once everything has been sent. To keep the output buffer from growing too large, redis limits its size. The limits can be changed in /etc/redis/redis.conf:

```
client-output-buffer-limit normal 0 0 0
client-output-buffer-limit slave 256mb 64mb 60
client-output-buffer-limit pubsub 32mb 8mb 60
```

There are three client classes: Normal, Slaves and Pub/Sub.

- Normal: ordinary clients. The default limit is 0, i.e. no limit.
- Pub/Sub: publish/subscribe clients. Default hard limit 32M, soft limit 8M/60s.
- Slaves: replication clients. Default hard limit 256M, soft limit 64M/60s.

Here, "hard limit" is a hard cap on the buffer size, "soft limit" is a soft cap, and "soft seconds" is how long the buffer may stay at or above the soft limit. If a client buffer reaches the soft limit and stays there for soft seconds, the server immediately closes the connection to the client; if it reaches the hard limit, the server likewise closes the connection immediately. If the needed buffer size and duration are hard to estimate, the limits can simply be set to 0 (unlimited):

```
client-output-buffer-limit pubsub 0 0 0
```

Restart the redis service after changing the file.

#### 1.2. With several producers/publishers sending multiple topics in the same program, consumers/subscribers receive messages from the wrong topic ("cross-talk")

Cause analysis:

- Subscriber wrongly kept the message data in a global variable.
- The singleton pattern failed to distinguish different singletons by name.

Status: fixed (in version 0.1.7), without changing the calling convention.

### 2. Dependencies

This tool depends on the redis service, so install redis-server first:

```
apt install redis-server
```

Install the Python dependencies:

```
pip3.7 install opencv-python numpy pyyaml hiredis redis
pip2.7 install opencv-python numpy pyyaml hiredis redis
```

Note: setup.py itself declares these dependencies, but because of the company network they may fail to download during installation, so installing them manually beforehand is recommended.

### 3. Deploying as a program

step1. Create the directory robot/configs under ~. Note: do not use any other path.

step2. Copy the configuration file config/pubsub.yaml from the project directory to ~/robot/configs. Note: do not rename the file.

step3. Edit ~/robot/configs/pubsub.yaml to match your actual setup:

```yaml
# camera data pub & sub config
mono:
  type: "pub"
  source: "0"
  host: "172.172.0.10"
  port: "6379"
  db: 3
  name: "robot_mono"
  topic: "robot_mono"

# bbox from FairMOT pub & sub config
track:
  type: "pub"
  source: "0"
  host: "172.172.0.11"
  port: "6379"
  db: 3
  name: "robot_track"
  topic: "robot_track"
```

Parameter notes:

- mono: the monocular camera; do not rename this key
  - type: publish; do not change
  - source: camera device id; check with `ls /dev/video*` together with `v4l2-ctl -d /dev/video0 --all`
  - host: publisher IP, i.e. the IP of the server running the pubsub program
  - port: publisher port, i.e. the port of the Redis service on that server
  - db: no change needed
  - name: name of the publishing service
  - topic: name of the published topic
- track: the tracking algorithm; do not rename this key
  - type: publish; do not change
  - source: do not change
  - host: publisher IP, i.e. the IP of the server running the tracking algorithm
  - port: publisher port, i.e. the port of the Redis service on that server
  - db: no change needed
  - name: name of the publishing service
  - topic: name of the published topic

step5. Declare the configs path in ~/.bashrc

Run the following command to add the configs path to the environment:

```
echo export ROBOT_CONFIGS=$(dirname ~/.)'/robot/configs' >> ~/.bashrc && source ~/.bashrc
```

Make sure the configuration file is placed in the directory given above.

step6. If there is a master/slave deployment, repeat the steps above on the slave machine.

Note: the module and the program use the same configuration file!

#### 3.1. Start on boot [untested]

Auto-start code was added to the /etc/rc.local script:

```
export PUBLISHERPATH=$(dirname ~/.)'/robot/pubsub'
source $(dirname $(which conda))/activate pubsub
cd $PUBLISHERPATH
python app.py
```

The first part declares the MONOPUB path, the next enters that path, then the publish.sh script is started in the background with its log written to pub.log, and finally it returns to home. You can check with `ps -ef | grep python`.

### 4. Using as a module

Tested under both Python 2.7 and Python 3.7.

#### 4.1. Installation

```
pip install aa-pubsub
```

#### 4.2. Configuration

If master and slave are deployed under the same IP, no extra configuration is needed. If they are deployed under different IPs, the slave machine needs to be configured in the same way as the master.

Note: the module and the program use the same configuration file!

### 5. Performance

#### 5.1. Between local Docker containers

The latency from the Publisher reading the camera to the Subscriber receiving the data is currently about 100 ms, of which about 20 ms is caused by docker.

#### 5.2. Within the same LAN segment

The latency from the Publisher reading the camera to the Subscriber receiving the data is currently about 100 ms; "same segment" here means the office network (10.1.x.x).

#### 5.3. Across LAN segments

The latency from the Publisher reading the camera to the Subscriber receiving the data is currently about 100-200 ms; "different segments" here means the office network versus the GPU servers (10.1.x.x and 10.0.40.x).

### 6. Development / contributing

#### 6.1. Test cases

The unified entry point for running the tests is the test.py file:

```
python test.py
```

Implement test cases under the testcases directory, in files named test_{name}.py. Inside each py file, name the test class {name}Test. See unittest for more.

#### 6.2. Releasing

Install the tool needed for uploading:

```
pip install twine
```

Bump the version number in version.py, then release:

```
python setup.py sdist
twine upload dist/*
```

### 7. Messages currently supported through pubsub

```yaml
# monocular camera
mono:
  type: "pub"
  source: "0"
  host: "127.0.0.1"
  port: "6379"
  db: 3
  name: "robot_mono"
  topic: "robot_mono"

# bbox produced by FairMOT
track:
  type: "pub"
  source: "0"
  host: "127.0.0.1"
  port: "6379"
  db: 3
  name: "robot_track"
  topic: "robot_track"

# on/off switch for follow-target detection
switch_send:
  type: "pub"
  source: "0"
  host: "10.1.101.179"
  port: "6379"
  db: 3
  name: "robot_switch_send"
  topic: "robot_switch_send"

# execution feedback for the follow-target detection switch
switch_feedback:
  type: "pub"
  source: "0"
  host: "10.1.101.179"
  port: "6379"
  db: 3
  name: "robot_switch_feedback"
  topic: "robot_switch_feedback"

# follow-target switching signal
control_singal_load_track_feature:
  type: "pub"
  source: "0"
  host: "10.1.101.179"
  port: "6379"
  db: 3
  name: "control_singal_load_track_feature"
  topic: "control_singal_load_track_feature"

# image path of the matched follow target
target_matched:
  type: "pub"
  source: "0"
  host: "10.1.101.141"
  port: "6380"
  db: 3
  name: "target_matched"
  topic: "target_matched"

# fall-detection switch
fall_down_switch:
  type: "pub"
  source: "0"
  host: "10.1.157.165"
  port: "6379"
  db: 3
  name: "fall_down_switch"
  topic: "fall_down_switch"

# feedback of the fall-detection switch
fall_down_switch_result:
  type: "sub"
  source: "0"
  host: "10.1.157.165"
  port: "6379"
  db: 3
  name: "fall_down_switch_result"
  topic: "fall_down_switch_result"

# fall-status information of the target person
target_status:
  type: "sub"
  source: "0"
  host: "10.1.157.165"
  port: "6379"
  db: 3
  name: "target_status"
  topic: "target_status"
```
aapy
Python code to retrieve data from theEPICS Archiver Appliance.UsageNote on timezonesWhen you pass a datetime to aapy it doesn't know by default what timezone that datetime is supposed to be in. It will assume that it is the local timezone, but will print a warning. If you pass it a timezone-aware datetime no warning will be printed. You can useutc_datetime()as a shortcut:>>> from aa.utils import utc_datetime >>> utc_datetime(2019, 10, 7, 17) # 5pm UTC on 7th October 2019Fetching dataTo retrieve data, create the appropriate fetcher>>> from aa.js import JsonFetcher >>> jf = JsonFetcher('archappl.diamond.ac.uk', 80)You can request a single event, returning an ArchiveEvent object:>>> from datetime import datetime >>> event = jf.get_event_at('SR-DI-DCCT-01:SIGNAL', datetime.now()) WARNING:root:Assuming timezone for 2019-10-07 16:42:13.301672 is Europe/London Archive event for PV SR-DI-DCCT-01:SIGNAL: timestamp 2019-10-07 15:42:04.876639 UTC value [301.33007915] severity 0 >>> event.value array([300.77982715]) >>> event.utc_datetime datetime.datetime(2019, 10, 7, 16, 2, 54, 928836, tzinfo=<UTC>)You can also request a range of events, returning an ArchiveData object:>>> data = jf.get_values('SR-DI-DCCT-01:SIGNAL', utc_datetime(2018, 1, 7), utc_datetime(2018, 1, 8)) >>> data.values array([[2.51189843e-03], [1.56371643e-03], [5.54392030e-04], ..., [2.77373366e+02], [2.77329542e+02], [2.77287664e+02]]) >>> data.utc_datetimes array([datetime.datetime(2018, 1, 6, 23, 59, 59, 3897, tzinfo=<UTC>), datetime.datetime(2018, 1, 7, 0, 0, 2, 3975, tzinfo=<UTC>), datetime.datetime(2018, 1, 7, 0, 0, 5, 4066, tzinfo=<UTC>), ..., datetime.datetime(2018, 1, 7, 23, 59, 53, 3885, tzinfo=<UTC>), datetime.datetime(2018, 1, 7, 23, 59, 56, 3825, tzinfo=<UTC>), datetime.datetime(2018, 1, 7, 23, 59, 59, 3726, tzinfo=<UTC>)], dtype=object) >>> len(data) 28764Developmentaapy uses Pipenv to manage its dependencies.To install development requirements:pipenv install --devTo run the tests and static 
checks: `pipenv run tests`
a.arabaci
No description available on PyPI.
aaransia
# 3aransia

Transliteration of languages and dialects

## Contribution

For contribution you can refer to CONTRIBUTING.md

## Features

- Fast and reliable - it uses default variables to access data
- Bulk transliteration
- API available
- Multilanguage transliteration available
- 70 languages and dialects supported

## Languages and dialects supported

1. Afrikaans 2. Algerian 3. Arabic 4. Azerbaijani 5. Bosnian 6. Catalan 7. Corsican 8. Czech 9. Welsh 10. Danish 11. German 12. Greek 13. English 14. Esperanto 15. Spanish 16. Estonian 17. Basque 18. Persian 19. Finnish 20. French 21. Frisian 22. Irish 23. Gaelic 24. Galician 25. Hausa 26. Croatian 27. Creole 28. Hungarian 29. Hawaiian 30. Indonesian 31. Igbo 32. Icelandic 33. Italian 34. Kinyarwanda 35. Kurdish 36. Latin 37. Libyan 38. Lithuanian 39. Luxembourgish 40. Latvian 41. Moroccan 42. Malagasy 43. Maori 44. Malay 45. Maltese 46. Dutch 47. Norwegian 48. Polish 49. Portuguese 50. Romanian 51. Samoan 52. Shona 53. Slovak 54. Slovenian 55. Somali 56. Albanian 57. Sesotho 58. Sundanese 59. Swedish 60. Swahili 61. Filipino 62. Tunisian 63. Turkish 64. Turkmen 65. Urdu 66. Uzbek 67. Vietnamese 68. Xhosa 69. Yoruba 70. Zulu

## Installation

pip install aaransia

## Usage

Transliterate from one language or dialect to another:

```python
from aaransia import transliterate

ARABIC_SENTENCE = "كتب بلعربيا هنايا شحال ما بغيتي"
print(transliterate(ARABIC_SENTENCE, source='ar', target='ma'))
```

>>> ktb bl3rbya hnaya ch7al ma bghiti

Transliterate across languages and dialects using the universal parameter:

```python
from aaransia import transliterate, SourceLanguageError

MOROCCAN_ARABIC_SENTENCE = "ktb بلعربيا hnaya شحال ما بغيتي"
try:
    print(transliterate(MOROCCAN_ARABIC_SENTENCE, source='ar', target='ma'))
except SourceLanguageError as source_language_error:
    print(source_language_error)
print(transliterate(MOROCCAN_ARABIC_SENTENCE, source='ar', target='ma', universal=True))
print(transliterate(MOROCCAN_ARABIC_SENTENCE, source='ma', target='ar', universal=True))
```

>>> Source alphabet language doesn't match the input text: ar
>>> ktb bl3rbya hnaya chhal ma bghyty
>>> كتب بلعربيا هنايا شحال ما بغيتي

Get all alphabet codes:

```python
from aaransia import get_alphabets_codes

print(len(get_alphabets_codes()))
print(get_alphabets_codes())
```

>>> 70
>>> ['ar', 'af', 'sq', 'al', 'az', 'eu', 'bo', 'ca', 'co', 'hr', 'cs', 'da', 'nl', 'en', 'eo', 'et', 'tl', 'fi', 'fr', 'fs', 'gl', 'de', 'ht', 'ha', 'hw', 'hu', 'is', 'ig', 'id', 'ga', 'it', 'ki', 'ku', 'la', 'lv', 'li', 'lt', 'lu', 'ma', 'mg', 'ms', 'mt', 'mo', 'no', 'pl', 'pt', 'ro', 'sa', 'gc', 'el', 'ss', 'sh', 'sk', 'sl', 'so', 'es', 'su', 'sw', 'sv', 'tn', 'tr', 'tu', 'uz', 'vi', 'cy', 'xh', 'yo', 'zu', 'fa', 'ur']

Get all alphabets:

```python
from aaransia import get_alphabets

print(get_alphabets())
```

```
{'af': 'Afrikaans Alphabet', 'al': 'Algerian Alphabet', 'ar': 'Arabic Alphabet', 'az': 'Azerbaijani Alphabet', 'bo': 'Bosnian Alphabet', 'ca': 'Catalan Alphabet', 'co': 'Corsican Alphabet', 'cs': 'Czech Alphabet', 'cy': 'Welsh Alphabet', 'da': 'Danish Alphabet', 'de': 'German Alphabet', 'el': 'Greek Alphabet', 'en': 'English Alphabet', 'eo': 'Esperanto Alphabet', 'es': 'Spanish Alphabet', 'et': 'Estonian Alphabet', 'eu': 'Basque Alphabet', 'fa': 'Persian Alphabet', 'fi': 'Finnish Alphabet', 'fr': 'French Alphabet', 'fs': 'Frisian Alphabet', 'ga': 'Irish Alphabet', 'gc': 'Gaelic Alphabet', 'gl': 'Galician Alphabet', 'ha': 'Hausa Alphabet', 'hr': 'Croatian Alphabet', 'ht': 'Creole Alphabet', 'hu': 'Hungarian Alphabet', 'hw': 'Hawaiian Alphabet', 'id': 'Indonesian Alphabet', 'ig': 'Igbo Alphabet', 'is': 'Icelandic Alphabet', 'it': 'Italian Alphabet', 'ki': 'Kinyarwanda Alphabet', 'ku': 'Kurdish Alphabet', 'la': 'Latin Alphabet', 'li': 'Libyan Alphabet', 'lt': 'Lithuanian Alphabet', 'lu': 'Luxembourgish Alphabet', 'lv': 'Latvian Alphabet', 'ma': 'Moroccan Alphabet', 'mg': 'Malagasy Alphabet', 'mo': 'Maori Alphabet', 'ms': 'Malay Alphabet', 'mt': 'Maltese Alphabet', 'nl': 'Dutch Alphabet', 'no': 'Norwegian Alphabet', 'pl': 'Polish Alphabet', 'pt': 'Portuguese Alphabet', 'ro': 'Romanian Alphabet', 'sa': 'Samoan Alphabet', 'sh': 'Shona Alphabet', 'sk': 'Slovak Alphabet', 'sl': 'Slovenian Alphabet', 'so': 'Somali Alphabet', 'sq': 'Albanian Alphabet', 'ss': 'Sesotho Alphabet', 'su': 'Sundanese Alphabet', 'sv': 'Swedish Alphabet', 'sw': 'Swahili Alphabet', 'tl': 'Filipino Alphabet', 'tn': 'Tunisian Alphabet', 'tr': 'Turkish Alphabet', 'tu': 'Turkmen Alphabet', 'ur': 'Urdu Alphabet', 'uz': 'Uzbek Alphabet', 'vi': 'Vietnamese Alphabet', 'xh': 'Xhosa Alphabet', 'yo': 'Yoruba Alphabet', 'zu': 'Zulu Alphabet'}
```

## Adding a language or a dialect

- Add it to the alphabet CSV file
- Generate the whole alphabet with the construct_alphabet function from data.py
- Update defaults.py (the order is to be respected):
  - Add the alphabet code
  - Add the alphabet name
  - Add both of them to the alphabet dictionary
  - Add the double letters if there are any
- Test a text with the language just added against all other languages in test.py:
  - Add a language text to test in text_samples (the order is to be respected)
  - Add test handling for the new language
  - Test it by using the command `python -m unittest discover -s aaransia` from the 3aransia repository
- Fix the bugs
- Validate it semantically and phonetically
- Make a pull request
- Wait for the PR confirmation and add your name to the collaborators

## Fixing bugs and adding features

- pylint code before doing a PR
- Contribution can also be made through adding issues

## Other related projects

- 3aransia.api: The api of 3aransia
- 3aransia.web: The web application of 3aransia
aa-ravworks-exporter
# AllianceAuth Ravworks Exporter

This is a plugin for AllianceAuth that exports ESI data to a config file for Ravworks.

## Prerequisites

This plugin requires a working AllianceAuth installation with v3 as a minimum version. See the AllianceAuth installation instructions for more information.

Data from ESI are not fetched directly by this application; instead it relies on other AllianceAuth plugins. Currently the following plugins are supported:

- Skills: MemberAudit, CorpTools
- Structures: aa-structures

If there are multiple plugins for the same functionality, only one is needed. If no plugin is installed, that functionality will be unavailable.

## Installation

Install the package with pip:

```
pip install aa-ravworks-exporter
```

Add `'ravworks_exporter',` to your `INSTALLED_APPS` in `local.py`.

## Basic usage

1. Download the config file from the Ravworks website. This app doesn't create config files directly but rather updates existing ones with data from ESI.
2. Use it in the form on this plugin's page on AllianceAuth.
3. Download the updated config file and use it on the Ravworks website.
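The `INSTALLED_APPS` change above can be sketched as follows. This is only an illustration: in a real Alliance Auth install the list already exists in your `local.py`, and the placeholder entries below are assumptions, not the actual contents of that file.

```python
# Sketch of the settings change described above. INSTALLED_APPS is already
# defined in a real Alliance Auth local.py; it is stubbed here only so the
# snippet is self-contained.
INSTALLED_APPS = [
    "allianceauth",  # placeholder entries, not the real full list
    "esi",
]

# The line this README asks you to add:
INSTALLED_APPS += ["ravworks_exporter"]

print("ravworks_exporter" in INSTALLED_APPS)  # True
```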
aarc-entitlement
This package provides python classes to create, parse and compare entitlements according to the AARC recommendations G002 and G069.InstallationInstall using pip:pip install aarc-entitlementDocumentationThe documentation is available athttps://aarcentitlement.readthedocs.io.The G002 recommendation can be found athttps://aarc-community.org/guidelines/aarc-g002.ExamplesCheck if a user entitlement permits usage of a serviceimportaarc_entitlement# This entitlement is needed to use a servicerequired=aarc_entitlement.G002("urn:geant:h-df.de:group:aai-admin")# This entitlement is held by a user who wants to use the serviceactual=aarc_entitlement.G002("urn:geant:h-df.de:group:aai-admin:role=member")# Is the user permitted to use the service, because of its entitlement `actual`?permitted=actual.satisfies(required)# -> True here# Are the two entitlements the same?equals=required==actual# -> False hereG069 Entitlement NormalizationStarting with recommendation G069 the specification requires normalization of entitlements. When usingAarcEntitlementG069the library produces normalized representations.importaarc_entitlementnot_normalized="UrN:NiD:ExAmPlE.oRg:group:Minun%20Ryhm%c3%a4ni"normalized=repr(aarc_entitlement.G069(not_normalized))# -> "urn:nid:example.org:group:Minun%20Ryhm%C3%A4ni"Tests, Linting and DocumentationRun tests for all supported python versions:# run tests, coverage and linter tox # build docs tox -e docs # After this, the documentation should be located at `doc/build/index.html`.PackagingTo upload a new package version to pypi use the Makefile:# build the package make dist # upload the package to pypi make uploadFunding NoticeThe AARC project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 653965 and 730941.
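The `satisfies` check from the example above can be illustrated with a toy prefix comparison. This is not the library's actual algorithm (real AARC-G002 matching also handles the authority part, subgroups and roles precisely); it only sketches the idea that an entitlement with extra components satisfies a less specific one:

```python
def toy_satisfies(actual: str, required: str) -> bool:
    """Toy illustration only: treat an entitlement as ':'-separated
    components and say `actual` satisfies `required` when the required
    components are a prefix of the actual ones. The aarc_entitlement
    classes implement the full AARC-G002 semantics."""
    a = actual.split(":")
    r = required.split(":")
    return a[:len(r)] == r


required = "urn:geant:h-df.de:group:aai-admin"
actual = "urn:geant:h-df.de:group:aai-admin:role=member"

print(toy_satisfies(actual, required))  # True: the member may use the service
print(toy_satisfies(required, actual))  # False: the reverse does not hold
```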
aarc-g002-entitlement
AARC G002 Entitlement ParserDeprecation warningThe newerAarcEntitlementsupports AARC-G002 and the updated version AARC-G069.IntroductionThis package provides a python Class to parse and compare entitlements according to the AARC-G002 Recommendationhttps://aarc-community.org/guidelines/aarc-g002.Examplefromaarc_g002_entitlementimportAarc_g002_entitlementrequired=Aarc_g002_entitlement('urn:geant:h-df.de:group:aai-admin',strict=False)actual=Aarc_g002_entitlement('urn:geant:h-df.de:group:aai-admin:role=member#backupserver.used.for.developmt.de')# is a user with actual permitted to use a resource which needs required?permitted=required.is_contained_in(actual)# True in this case# are the two entitlements the same?equals=required==actual# False in this caseFor more examples:./example.pyInstallationpip --user install aarc-g002-entitlementDocumentationtox -e docsAfter this, the documentation should be located atdoc/build/index.html.Documentation is also available atReadthedocsTestsRun tests for all supported python versionstoxFunding NoticeThe AARC project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 653965 and 730941.
aarchimate
Something quick and dirty to help write AArch64 code.The name is a pun on Archimate.
aarddict
UNKNOWN
aardev-money
No description available on PyPI.
aar-doc
# aar-doc - Automated Ansible Role Documentation

aar-doc is a tool for generating documentation automatically from an Ansible role's metadata. Specifically, it reads the meta/main.yml and meta/argument_specs.yml files.

This is heavily inspired by terraform-docs, which does a similar thing with Terraform modules. aar-doc isn't nearly as featureful though, but should do the trick!

For instance, the only output format supported is Markdown. As with terraform-docs, you are able to override the default template however. As Ansible users are familiar with Jinja2, aar-doc uses it for templating.

Contributions are welcome to add support for more output formats!

## Installation

As aar-doc is a Python utility, the usual `pip install` works, but the tool isn't (yet) published on PyPI. So you'll need to point pip at the repository itself:

```
pip install git+https://gitlab.com/kankare/aar-doc
```

## Usage

```
Usage: aar-doc [OPTIONS] ROLE_PATH COMMAND [ARGS]...

  A tool for generating docs for Ansible roles.

Arguments:
  ROLE_PATH  Path to an Ansible role  [required]

Options:
  --config-file FILE              [default: .aar-doc.yml]
  --output-file FILE              [default: README.md]
  --output-template TEXT          Output template as a string or a path to a
                                  file.  [default: <!-- BEGIN_ANSIBLE_DOCS -->
                                  {{ content }}<!-- END_ANSIBLE_DOCS -->]
  --output-mode [inject|replace]  [default: inject]
  --install-completion [bash|zsh|fish|powershell|pwsh]
                                  Install completion for the specified shell.
  --show-completion [bash|zsh|fish|powershell|pwsh]
                                  Show completion for the specified shell, to
                                  copy it or customize the installation.
  --help                          Show this message and exit.

Commands:
  markdown  Command for generating role documentation in Markdown format.
```

### Modes

The inject mode will inject only the changed content in between the BEGIN_ANSIBLE_DOCS and END_ANSIBLE_DOCS markers. This makes it possible to have header and footer text in the file that is not touched. This is the default mode, and will revert to replace if the file does not exist, to create it the first time.

The replace mode will replace the whole file with the template.
Usually, the inject mode should be fine for regular usage, and changing the mode is not necessary unless you want to overwrite an existing file.

## Configuration

The configuration options can be provided either via the CLI arguments shown in `--help`, or a `--config-file` in YAML format. Note that the options have underscores when provided via the configuration file.

Examples:

```
aar-doc --output-file ROLE.md --output-mode replace ...
```

```yaml
---
output_file: ROLE.md
output_mode: replace
```

## Templating

You can override the `--output-template` used for rendering the document. This may be passed in as a string containing Jinja2, or a path to a file. As noted above, this option may be passed in via CLI or the configuration file.

In the configuration file, easiest is to do a multiline string:

```yaml
---
output_template: |
  <!-- BEGIN_ANSIBLE_DOCS -->
  This is my role: {{ role }}
  <!-- END_ANSIBLE_DOCS -->
```

As noted above, the template must start and end with the markers as comments, for the content injection to work.

{{ content }} will contain the rendered builtin output specific template content.
For Markdown, see templates/markdown.j2.

You will most likely want to skip using it in your own template however, and use the provided variables containing the role metadata and argument specs directly. aar-doc does not manipulate the values coming in from the YAML files in any way.

Template variables:

- role: The role name
- content: Whole content as pre-rendered by the built in templates
- metadata: Metadata read from meta/main.yml
- argument_specs: Metadata read from meta/argument_specs.yml

Example:

```
<!-- BEGIN_ANSIBLE_DOCS -->
This is my role: {{ role }}

<!-- 'metadata' contains all the things in meta/main.yml -->
{{ metadata.galaxy_info.license }}

<!-- All the usual Jinja2 filters are available -->
{{ metadata.galaxy_info.galaxy_tags | sort }}

<!-- Including files is also possible in relation to the role's directory with Jinja2's include directive -->
{% include "defaults/main.yml" %}

<!-- 'argument_specs' contains all the things in meta/argument_specs.yml -->
{% for entrypoint, specs in argument_specs | items %}
Task file name: {{ entrypoint }}.yml has {{ specs | length }} input variables!
{% endfor %}
<!-- END_ANSIBLE_DOCS -->
```

```
aar-doc --output-template ./path/to/template.j2 ...
```

More examples can be found in the tests.

## License

MIT

## Acknowledgements

Kudos to the original author Miika Kankare! Kudos to Kevin P. Fleming for his additions to the original project!
aardtools
UNKNOWN
aardvark
# Aardvark

Aardvark is a multi-account AWS IAM Access Advisor API (and caching layer).

## Install

Ensure that you have Python 3.6 or later. Python 2 is no longer supported.

```
git clone https://github.com/Netflix-Skunkworks/aardvark.git
cd aardvark
python3 -m venv env
. env/bin/activate
python setup.py develop
```

### Known Dependencies

- libpq-dev

## Configure Aardvark

The Aardvark config wizard will guide you through the setup.

```
% aardvark config

Aardvark can use SWAG to look up accounts. https://github.com/Netflix-Skunkworks/swag-client
Do you use SWAG to track accounts? [yN]: no
ROLENAME: Aardvark
DATABASE [sqlite:////home/github/aardvark/aardvark.db]:
# Threads [5]:
>> Writing to config.py
```

- Whether to use SWAG to enumerate your AWS accounts. (Optional, but useful when you have many accounts.)
- The name of the IAM Role to assume into in each account.
- The Database connection string. (Defaults to sqlite in the current working directory. Use RDS Postgres for production.)

### Create the DB tables

```
aardvark create_db
```

## IAM Permissions

Aardvark needs an IAM Role in each account that will be queried. Additionally, Aardvark needs to be launched with a role or user which can sts:AssumeRole into the different account roles.

AardvarkInstanceProfile:

- Only create one.
- Needs the ability to call sts:AssumeRole into all of the AardvarkRoles.

AardvarkRole:

- Must exist in every account to be monitored.
- Must have a trust policy allowing AardvarkInstanceProfile.
- Has these permissions:

```
iam:GenerateServiceLastAccessedDetails
iam:GetServiceLastAccessedDetails
iam:listrolepolicies
iam:listroles
iam:ListUsers
iam:ListPolicies
iam:ListGroups
```

So if you are monitoring n accounts, you will always need n+1 roles (n AardvarkRoles and 1 AardvarkInstanceProfile).

Note: For locally running aardvark, you don't have to take care of the AardvarkInstanceProfile. Instead, just attach a policy which contains "sts:AssumeRole" to the user you are using on the AWS CLI to assume Aardvark Role.
Also, the same user should be mentioned in the trust policy of Aardvark Role for proper assignment of the privileges.

## Gather Access Advisor Data

You'll likely want to refresh the Access Advisor data regularly. We recommend running the update command about once a day. Cron works great for this.

Without SWAG, you can pass comma separated account numbers:

```
aardvark update -a 123456789012,210987654321
```

With SWAG, Aardvark can look up accounts, so you can run against all with:

```
aardvark update
```

or by account name/tag with:

```
aardvark update -a dev,test,prod
```

## API

### Start the API

```
aardvark start_api -b 0.0.0.0:5000
```

In production, you'll likely want to have something like supervisor starting the API for you.

### Use the API

Swagger is available for the API at `<Aardvark_Host>/apidocs/#!`.

Aardvark responds to get/post requests. All results are paginated and pagination can be controlled by passing count and/or page arguments. Here are a few example queries:

```
curl localhost:5000/api/1/advisors
curl localhost:5000/api/1/advisors?phrase=SecurityMonkey
curl localhost:5000/api/1/advisors?arn=arn:aws:iam::000000000000:role/SecurityMonkey&arn=arn:aws:iam::111111111111:role/SecurityMonkey
curl localhost:5000/api/1/advisors?regex=^.*Monkey$
```

## Docker

Aardvark can also be deployed with Docker and Docker Compose. The Aardvark services are built on a shared container. You will need Docker and Docker Compose installed for this to work.

To configure the containers for your set of accounts, create a `.env` file in the root of this directory. Define the environment variables within this file. This example uses AWS Access Keys.
We recommend using instance roles in production.

```
AARDVARK_ROLE=Aardvark
AARDVARK_ACCOUNTS=<account id>
AWS_DEFAULT_REGION=<aws region>
AWS_ACCESS_KEY_ID=<your access key>
AWS_SECRET_ACCESS_KEY=<your secret key>
```

| Name | Service | Description |
|------|---------|-------------|
| AARDVARK_ROLE | collector | The name of the role for Aardvark to assume so that it can collect the data. |
| AARDVARK_ACCOUNTS | collector | Optional if using SWAG, otherwise required. Set this to a list of SWAG account name tags or a list of AWS account numbers from which to collect Access Advisor records. |
| AWS_ARN_PARTITION | collector | Required if not using an AWS Commercial region. For example, `aws-us-gov`. By default, this is `aws`. |
| AWS_DEFAULT_REGION | collector | Required if not running on an EC2 instance with an appropriate Instance Profile. |
| AWS_ACCESS_KEY_ID | collector | Required if not running on an EC2 instance with an appropriate Instance Profile. Set these to the credentials of an AWS IAM user with permission to `sts:AssumeRole` to the Aardvark audit role. |
| AWS_SECRET_ACCESS_KEY | collector | Required if not running on an EC2 instance with an appropriate Instance Profile. Set these to the credentials of an AWS IAM user with permission to `sts:AssumeRole` to the Aardvark audit role. |
| AARDVARK_DATABASE_URI | collector and apiserver | Specify a custom database URI supported by SQLAlchemy. By default, this will use the `AARDVARK_DATA_DIR` value to create a SQLite database. Example: `sqlite:///$AARDVARK_DATA_DIR/aardvark.db` |

Once this file is created, build the containers and start the services. Aardvark consists of three services:

- Init - The init container creates the database within the storage volume.
- API Server - This is the HTTP webserver that will serve the data. By default, this is listening on http://localhost:5000/apidocs/#!.
- Collector - This is a daemon that will fetch and cache the data in the local SQL database.
This should be run periodically.

```shell
# build the containers
docker-compose build

# start up the containers
docker-compose up
```

Finally, to clean up the environment:

```shell
# bring down the containers
docker-compose down

# remove the containers
docker-compose rm
```

## Notes

### Threads

Aardvark will launch the number of threads specified in the configuration. Each of these threads will retrieve Access Advisor data for an account and then persist the data.

### Database

The regex query is only supported in Postgres (natively) and SQLite (via some magic courtesy of Xion in the sqla_regex file).

### TLS

We recommend enabling TLS for any service. Instructions for setting up TLS are out of scope for this document.

## Signals

New in v0.3.1

Aardvark uses Blinker for signals in its update process. These signals can be used for things like emitting metrics, additional logging, or taking more actions on accounts. You can use them by writing a script that defines your handlers and calls aardvark.manage.main(). For example, create a file called signals_example.py with the following contents:

```python
import logging

from aardvark.manage import main
from aardvark.updater import AccountToUpdate

logger = logging.getLogger('aardvark_signals')


@AccountToUpdate.on_ready.connect
def handle_on_ready(sender):
    logger.info(f"got on_ready from {sender}")


@AccountToUpdate.on_complete.connect
def handle_on_complete(sender):
    logger.info(f"got on_complete from {sender}")


if __name__ == "__main__":
    main()
```

This file can now be invoked in the same way as manage.py:

```
python signals_example.py update -a cool_account
```

The log output will be similar to the following:

```
INFO: getting bucket swag-bucket
INFO: Thread #1 updating account 123456789012 with all arns
INFO: got on_ready from <aardvark.updater.AccountToUpdate object at 0x10c379b50>
INFO: got on_complete from <aardvark.updater.AccountToUpdate object at 0x10c379b50>
INFO: Thread #1 persisting data for account 123456789012
INFO: Thread #1 FINISHED persisting data for account 123456789012
```

### Available signals

| Class | Signals |
|-------|---------|
| manage.UpdateAccountThread | on_ready, on_complete, on_failure |
| updater.AccountToUpdate | on_ready, on_complete, on_error, on_failure |

## TODO

See TODO
aardvark-py
Please refer to LICENSE.txt and README.txt in the package.
aardwolf
No description available on PyPI.
aardwolfgui
No description available on PyPI.
aa-relays
No description available on PyPI.
aarghparse
## Motivation

I was having a shower and this name came up.

At the moment this is a collection of argparse extensions I have written and found useful in recent times.

## Features

- @arg_converter decorator to write simple argument value parsers without the argparse.Action boilerplate
- @subcommand decorator to save you from all the add_subparsers and set_defaults(func=).
- @cli decorator to generate a command-line interface.

## Example

The example below combines all the features, but the tool doesn't enforce it on you.

If you have an existing argparse.ArgumentParser definition, you should be able to replace it with aarghparse by just changing the initialisation line to `parser = aarghparse.ArgumentParser(...)`.

```python
import datetime as dt

import dateutil.tz

from aarghparse import ArgumentParser, arg_converter, cli


@cli
def calendar_cli(parser: ArgumentParser, subcommand: ArgumentParser.subcommand):
    """
    Command-line calendar.
    """

    parser.add_argument(
        '--date-format',
        default=None,
    )

    @arg_converter
    def tz_arg(value):
        return dateutil.tz.gettz(value)

    @subcommand(
        name="now",
        args=[
            ["--tz", {
                "action": tz_arg,
                "help": "Timezone",
            }],
        ],
    )
    def now_cmd(args):
        """
        Prints today's date.
        """
        date_format = args.date_format or "%Y-%m-%d %H:%M:%S"
        print(dt.datetime.now(tz=args.tz).strftime(date_format))


if __name__ == "__main__":
    calendar_cli.run()
```

If you install python-dateutil then you can try the above with:

```
python -m aarghparse.examples.calendar --help
python -m aarghparse.examples.calendar now --help
python -m aarghparse.examples.calendar now --tz "Europe/Riga"
python -m aarghparse.examples.calendar --date-format "%d.%m.%Y." now --tz "Europe/Riga"
```
aarish-api
A package that allows you to use the state-of-the-art pretrained FaceNet model for calculating the distance between two faces with just one function call.
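The one-function-call distance described above ultimately compares two FaceNet embedding vectors. As a minimal illustration of that underlying computation (not aarish-api's actual API; the function name and toy 4-dimensional vectors below are made up, and real FaceNet embeddings are 128- or 512-dimensional), the match score is typically the Euclidean distance between the two embeddings:

```python
import math

def embedding_distance(a, b):
    """Euclidean distance between two equal-length embedding vectors.

    FaceNet-style models map each face image to a fixed-length vector;
    a small distance between two vectors suggests the same person.
    This helper is an illustration, not aarish-api's interface.
    """
    if len(a) != len(b):
        raise ValueError("embeddings must have the same length")
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy "embeddings" standing in for the vectors a FaceNet model would emit
face_a = [0.1, 0.2, 0.3, 0.4]
face_b = [0.1, 0.2, 0.3, 0.8]
print(round(embedding_distance(face_a, face_b), 6))  # 0.4
```

In practice, a threshold on this distance decides whether two faces count as a match.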
aark-sdk
No description available on PyPI.
aaron
IntroductionInstallpipinstallaaronHow to useYou probably don’t want to (yet)
aaronai
No description available on PyPI.
aaronsteers
No description available on PyPI.
aa-routing
Routing for Alliance Auth

Routing is a pathfinding plugin for Alliance Auth.

Features

def route_path(source: int, destination: int, mode="p_shortest", algorithm="astar", edges: list = [], static_cache: bool = False) -> List[int]:
def route_length(source: int, destination: int, mode="p_shortest", algorithm="astar", edges: List = [], static_cache: bool = False) -> int:
def systems_range(source: int, range: int, mode="p_shortest", edges: list = [], static_cache: bool = False) -> List:

An optional pregenerated graph dict, to reduce DB load and processing for mass use; accuracy cannot be guaranteed without shipping new versions.

Implementations Planned: AA Drifters, AA Incursions

Installation

Routing is an app for Alliance Auth; please make sure you have Alliance Auth installed first. Routing is not a standalone Django application.

Step 1 - Install app

pip install aa-routing

Step 2 - Configure Auth settings

Configure your Auth settings (local.py) as follows:

- Add 'routing' to INSTALLED_APPS
- Add below lines to your settings file:

Step 3 - Maintain Alliance Auth

- Run migrations: python manage.py migrate
- Gather your staticfiles: python manage.py collectstatic
- Restart your project: supervisorctl restart myauth:

Step 4 - Configuration

In the Admin interface, visit routing or <AUTH-URL>/admin/routing

Settings

| Name | Description | Default |
| --- | --- | --- |

Contributing

Make sure you have signed the License Agreement by logging in at https://developers.eveonline.com before submitting any pull requests. All bug fixes or features must not include extra superfluous formatting changes.
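The route_path signature listed under Features suggests a shortest-path search over a graph of solar-system IDs. As a rough sketch of what such a call computes (this is not the app's implementation, which uses A* over EVE's static map data; the jump graph and system IDs below are illustrative only), an unweighted breadth-first search returns the same kind of system-ID list:

```python
from collections import deque

# Hypothetical jump graph: system ID -> directly connected system IDs.
# The real app would load this from EVE's static data or its cached graph.
JUMPS = {
    30000142: [30000144],
    30000144: [30000142, 30002780],
    30002780: [30000144, 30002187],
    30002187: [30002780],
}

def route_path(source: int, destination: int) -> list:
    """Shortest jump route as a list of system IDs (BFS sketch)."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for nxt in JUMPS.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return []  # no route between the two systems

print(route_path(30000142, 30002187))  # [30000142, 30000144, 30002780, 30002187]
```

In this sketch, route_length would simply be len(route_path(...)) - 1 jumps, and systems_range a BFS truncated at a given depth.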
aarpdf
No description available on PyPI.
aarpy
UNKNOWN
aarrr
No description available on PyPI.
aars
AARS - Aleph Active Record SDKAARS is a powerful and flexible Python library built on top of the Aleph decentralized storage network, designed to help you build better backends for your decentralized applications. It provides an easy-to-use interface for managing and querying your data, with a focus on performance and versatility.FeaturesAsynchronous, high-performance data storage and retrievalCustomizable schema with support for different data typesIndexing for efficient queryingRevision history tracking for recordsSupport for forgetting data (GDPR compliant)Built-in pagination for large result setsInstallationInstall AARS using pip:pipinstallaarsGetting StartedTo get started with AARS, you will need to define your data schema by creating classes that inherit from Record. These classes represent the objects you want to store and query on the Aleph network.Here's an example of how you can implement a simple social media platform, that we'll call "Chirper":fromsrc.aarsimportRecordclassUser(Record):username:strbio:strclassChirp(Record):author:Usercontent:strlikes:inttimestamp:intIn this example, we have a User class representing a user of Chirper, and a Chirp class representing a user's message. 
Now, let's create some indices to make querying our data more efficient:

from src.aars import Index

Index(User, 'username')
Index(Chirp, 'author')
Index(Chirp, 'timestamp')

With the schema defined and indices created, we only need to initialize an AARS session:

from src.aars import AARS

AARS()

It is enough to call the constructor once, and it will automatically initialize the session with the default settings. We can now perform various operations, such as creating new records, querying records, and updating records:

import time

# Create a new user (the User model above defines username and bio)
new_user = await User(username='chirpy_user', bio='I love chirping!').save()

# Create a new chirp
new_chirp = await Chirp(author=new_user, content='Hello, Chirper!', likes=0, timestamp=int(time.time())).save()

# Query chirps by author
chirps_by_author = await Chirp.filter(author=new_user).all()

# Update a chirp
new_chirp.likes += 1
updated_chirp = await new_chirp.save()

Documentation

For detailed documentation, including advanced features such as revision history, forgetting data, and pagination, refer to the docs folder in the repository or visit the official documentation website.

Building the Docs

To build the documentation, you will need to install the dependencies listed in requirements.txt and docs-requirements.txt. Then, run the following command:

mkdocs build

You can serve the documentation locally by running the following command:

mkdocs serve

Contributing

Contributions to AARS are welcome! If you have found a bug, want to suggest an improvement, or have a question, feel free to open an issue on the GitHub repository.

License

AARS is released under the MIT License.
aa-rss-to-discord
Alliance Auth RSS to Discord

A simple app to post selected RSS feeds to your Discord.

Contents: Installation, Step 0.5: Install AA-Discordbot, Step 1: Install the Package, Step 2: Configure Alliance Auth, Step 3: Finalizing the Installation, Step 4: Configure your RSS Feeds, Discord Bot Commands, Updating, Changelog, Translation Status, Contributing

Installation

Important: Please make sure you meet all preconditions before you proceed:

- Alliance Auth RSS to Discord is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding (see the official AA installation guide for details).
- Alliance Auth RSS to Discord needs AA-Discordbot to interact with your Discord server. Make sure it is installed and configured before installing this app.

Step 0.5: Install AA-Discordbot

In order for this app to work, you need to install and configure AA-Discordbot first. Read the instructions on how to do so in the README of AA-Discordbot.

Step 1: Install the Package

Make sure you're in the virtual environment (venv) of your Alliance Auth installation. Then install the latest release directly from PyPI.

pip install aa-rss-to-discord

Step 2: Configure Alliance Auth

This is fairly simple, just configure your AA settings (local.py) as follows:

- Add "aa_rss_to_discord", to INSTALLED_APPS
- Add the scheduled task

CELERYBEAT_SCHEDULE["aa_rss_to_discord_fetch_rss"] = {
    "task": "aa_rss_to_discord.tasks.fetch_rss",
    "schedule": crontab(minute="*/5"),
}

Step 3: Finalizing the Installation

Run migrations to finalize the installation:

python manage.py migrate

Finally, restart your supervisor services for AA.

Step 4: Configure your RSS Feeds

First, you need to set up the Discord Server and Channels. For this, you go in your admin backend to the Discordbot settings and enter the needed information there. When done, you can set up your RSS feeds. This can be done in the settings of this app, still in your admin backend.
Create a new RSS Feed entry, enter the name, url and select the Discord channel it should be posted to. Once done, save it.Discord Bot CommandsThe following commands are available for the Discord bot to manage RSS/Atom feeds:CommandOptionsWhat it does!rss_add <rss_url> <rss_name>-rss_url- The URL of the RSS/Atom feed-rss_name- A Name for the RSS/Atom FeedAdding a RSS/Atom feed to the current channel!rss_delete <rss_feed_id>rss_feed_id- The ID of the RSS/Atom feed you want to removeRemove a RSS/Atom feed from the current Discord channel!rss_disable <rss_feed_id>rss_feed_id- The ID of the RSS/Atom feed you want to disableDisable an enabled RSS/Atom feed for the current Discord channel!rss_enable <rss_feed_id>rss_feed_id- The ID of the RSS/Atom feed you want to enableEnable a disabled RSS/Atom feed for the current Discord channel!rss_listNoneList all RSS/Atom feeds for the current Discord channelUpdatingTo update your existing installation of Alliance Auth RSS to Discord, first enable your virtual environment (venv) of your Alliance Auth installation.pipinstall-Uaa-rss-to-discord pythonmanage.pymigrateFinally, restart your supervisor services for AA.ChangelogSeeCHANGELOG.mdTranslation StatusDo you want to help translate this app into your language or improve the existing translation? -Join our team of translators!ContributingDo you want to contribute to this project? That's cool!Please make sure to read theContribution Guidelines.(I promise, it's not much, just some basics)
aart
Adaptive Analytical Ray Tracing (AART)AART is a numerical framework that exploits the integrability properties of Kerr spacetime to compute high-resolution black hole images and their visibility amplitude on long interferometric baselines. It implements a non-uniform adaptive grid in the image plane suitable to study black hole photon rings (narrow ring-shaped features predicted by general relativity but not yet observed).The code, described in detail in Ref. [1], implements all the relevant equations required to compute the appearance of equatorial sources on the (far) observer's screen. We refer the reader to Refs. [2-4] for derivations and further details. Through the code, the equations are mentioned as Pi Eq. N, which means Eq. N in Ref. [i].Please refer to the AART homepage found under the "Project links" header for an example Jupyter notebook in which use of the AART package is illustrated.The use of AART in scientific publications must be properly acknowledged. Please cite:Cardenas-Avendano, A., Lupsasca, A. & Zhu, H. "Adaptive Analytical Ray Tracing of Black Hole Photon Rings."arXiv:2211.07469We also request that AART modifications or extensions leading to a scientific publication be made public as free software.Feel free to use images and movies produced with this code (with attribution) for your next presentation!Last updated: 08.24.2023AART's ComponentsLensing Bands: the functions are located inlensingbands.py. This module computes Bardeen's coordinates inside the so-called lensing bands. Currently it only computes $0\le n\le 2$, but extension to a higher $n$ is possible: just copy the structure of the code and add the desired number $n$ on a Cartesian grid with different resolutions.Analytical Ray-Tracing: the main functions are located inraytracing_f, while the master function is located inraytracing.py. 
For a given location in Bardeen's plane ($\alpha,\beta$), it computes where it lands in the equatorial plane ($t,r,\theta=\pi/2,\phi$) in Boyer-Lindquist coordinates. The implementation does this per lensing band.Images: the source functions are located iniImages.py. It computes an image for a given analytical illumination profile specified in its arguments, if it is purely radial and analytical, or as an external file. The current version of the code supportsinoisy(https://arxiv.org/abs/2011.07151) outputs, where the external file is an HDF5 with a specific structure. Below you can find a low-resolution example.Visibility Amplitudes: the main functions are located invisamp_f.py, while the master function is located invisamp.py. It computes the visibility amplitudes for given intensities over $n$ lensing bands.Polarization: the main functions are located inpolarization_f.py, while the master function is located inpolarization.py. For a given magnetic field configuration specified in the arguments, it parallel transports the linear polarization of a photon.DependenciesPython Libraries:All the dependencies are located in theinit.pyfile. Most of the libraries will come natively with anaconda (e.g., numpy, scipy >=1.8, matplotlib, multiprocessing, skimage) but some may not.To install any missing packages, runpip install "package_name"or, if using anaconda, search for the missing packages and run, e.g. for h5py (read and write HDF5 files from Python),conda install -c anaconda h5pySometimes scipy does not automatically update to the latest version. If that is the case, you should runpip install -U scipyHow to run AARTThe parameters are set in the individual functions.We present some examples in the notebookExamples.ipynb.Lensing Bands:The lensing bands are computed by runningpython aart.lensingbands.clb(spins, angles)wherespins, andanglesare arrays. 
For example,spins=[0.5, 0.6], andangles=[17, 50]would result in the calculation of the lensing bands for the four spin-angle combinations(a, i):(0.5, 17), (0.5, 50), (0.6, 17), and (0.6, 50).The result of thelensingbandsfunction will be stored in a HDF5 file that contains the values of the Bardeen's coordinates within each lensing band. The datasets inside the resulting file are:alpha: the coordinate alpha of the critical curve. The parameternpointsScontrols the number of points used for the computation of the critical curve)beta: The coordinate beta of the critical curvehull_ni: the points for the inner convex hull of the $n$th band. Note that hull_0i corresponds to the location of the apparent horizon. hull_ne: the points for the outer convex hull of the $n$th band. Note that hull_0e corresponds to edges of the domain.gridn: the point within the $n$th lensing bandNn: number of points within the $n$th lensing bandlimn: the grids are cartesian and symmetric around zero. This dataset indicates the limits of the grid.This image is produced in the example code:Note that there are more arguments in thelensingbandsfunction. These can be identified by runningimport inspect inspect.getfullargspec(aart.lensingbands.clb)which will print out all the arguments that the functionlensingbandstakes. In the rest of the readme the full range of arguments that each function takes will not be listed; the above code can be run for any of the functions to find the range of arguments, while their descriptions can be found in the respective python files for the so-called master functions.Ray Tracing:To compute the equatorial radius, angle, and emission time of a photon, we perform a backward ray-tracing from the observer plane. 
By running the following, we evaluate the source radius, angle, and time within the grid from each lensing band:

python aart.raytracing.raytrace(spins, angles)

The result will be stored in an HDF5 file that contains the source radius, angle, and time, as well as the radial component of the four-momentum in the equatorial plane, for lensing bands n=0,1,2. The datasets inside the resulting file are:

rsn: the value of the $r$ Boyer-Lindquist coordinate for the $n$th lensing band. It follows the order of the lensing band
tn: the value of the $t$ Boyer-Lindquist coordinate for the $n$th lensing band. It follows the order of the lensing band
phin: the value of the $\phi$ Boyer-Lindquist coordinate for the $n$th lensing band. It follows the order of the lensing band
signn: the sign of the radial momentum of the emitted photon in the $n$th lensing band.

This image is produced in the example code:

Images:

Stationary and axisymmetric source profiles:

Once the lensing bands and the rays have been computed, an image can be produced using a specified analytical profile by running

def mu_p(spin_case): return 1-np.sqrt(1-spin_case**2)
python aart.radialintensity.cintensity(spins, angles, mu_p=mu_p, gammap=-3/2, sigmap=1/2, gfactor=3.0)

where mu_p, gammap, and sigmap specify the analytical profile according to Johnson's SU distribution and we've chosen specific values as an illustration. The datasets inside the resulting file are:

bghtsn: the intensity at each point in the image.

This image is produced in the example code:

Non-stationary and non-axisymmetric source profiles:

As the dataset produced after ray tracing contains all the information of the BL coordinates, one can also use analytical non-stationary and non-axisymmetric source profiles to produce images (that use the entire history of the profile) and movies. One can also use a precomputed equatorial profile. AART currently implements profiles computed with inoisy.
The example includes a test case (inoisy.h5), for which one can runpython aart.iImages.image(spins, angles)orpython aart.iMovies.movie(spins, angles)to produce an image or a set of images, respectively. Images can be produced by using a single equatorial profile, i.e., in the mode "stationary," or using the entire history of the equatorial structure, i.e, in the mode "dynamical." When movies are made, the dynamical version is assumed. In both cases, the resulting datasets inside the resulting file are:bghtsn: the intensity at each point in the image. When several snapshots are produced, these datasets will have three dimensions, where the first one denotes the time.This gif is produced in the example code:Visibility Amplitudes:With the images created using radial intensity profiles, one may then calculate the visibility of the image projected onto a baseline by runningpython aart.visamp.cvisamp(spins, angles, radonangles=radonangles)This function first performs radon transforms of the image at a set of specified angles (theradonanglesargument in some of the above functions), and then computes the visibility amplitude. The output is a set of h5 files, one for each baseline angle. These files contain the visibility amplitude as well as the frequency (baseline length in G$\lambda$). 
The resulting datasets inside the resulting file are:

freqs: the frequencies over which the visibility amplitudes were computed
visamp: the respective visibility amplitudes.

If in the function cvisamp the argument radonfile=1, the HDF5 file will also contain these two datasets:

radon: the resulting radon transformation
x_radon: the axis values of the projection.

This image is produced in the example code:

Polarization:

The linear polarization of a given configuration of the magnetic field can be computed by running

python aart.polarization.polarization(spins, angles)

The resulting datasets inside the resulting file are:

PK: the Walker-Penrose constant
EVPA_x: the x-component of the electric-vector position angle
EVPA_y: the y-component of the electric-vector position angle.

Limitations and known possible performance bottlenecks

This code has only been tested on Mac OS (M1 and Intel) and on Ubuntu.

If you want to run a retrograde disk, you will have to apply symmetry arguments. In other words, run the positive spin case ($-a$), flip the resulting lensing bands and rays, and then compute the intensity on each pixel. Note that the characteristic radii also need to be modified. We plan to add this feature in a future version.

The radon cut does not smoothly go to zero. This is sometimes clear from the visamp, where you can see an extra periodicity (wiggle) on each local maximum. To solve this issue, increase the FOV of the $n=0$ image by providing a larger value for the argument limits in the relevant functions. You can also modify the percentage of points used in npointsfit in visamp_f.py.

Producing the lensing bands can take too long; in particular, for some larger inclination values, computing the contours of the lensing bands and the points within them takes a long time. The calculation can be made faster at the cost of some accuracy if you decrease the number of points used to compute the contours, i.e., by decreasing the value of the argument npointsS.
It is faster to compute the convex hull instead of the concave hull (alpha shape), but then you will have to check that you are not missing points (having extra points is not an issue with the analytical formulae, as the results are masked out). If using the convex hull is acceptable, then you can also change the functionin_hullinlensingbands.pyto usehull.find_simplexinstead ofcontains_points.AuthorsAlejandro Cardenas-Avendano ([email protected])Hengrui ZhuAlex LupsascaLennox Keeble (python package creation and maintenance)References[1] Cardenas-Avendano, A., Lupsasca, A. & Zhu, H. Adaptive Analytical Ray Tracing of Black Hole Photon Rings.arXiv:2211.07469[2] Gralla, S. E., & Lupsasca, A. (2020). Lensing by Kerr black holes. Physical Review D, 101(4), 044031.[3] Gralla, S. E., & Lupsasca, A. (2020). Null geodesics of the Kerr exterior. Physical Review D, 101(4), 044032.[4] Gralla, S. E., Lupsasca, A., & Marrone, D. P. (2020). The shape of the black hole photon ring: A precise test of strong-field general relativity. Physical Review D, 102(12), 124004.MIT LicensePermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
aartfaac-arthur
A set of tools for building AARTFAAC visualisations. Things like:

A fast and simple imager
Tools for streaming images to an RTMP server (YouTube)
Tools for visualising all kinds of other metadata

Requirements

Python 3, but Python 2 mostly works.

Why ARTHUR?

Because Arthur is the anthropomorphic AARTFAAC.

Notes

This repo uses git lfs. If you want to run the test suite, don't forget to unzip the tarball in test/data.
aas2openapi
aas2openapi - Middleware for Asset Administration Shell and openAPI 3.0

aas2openapi is a middleware for the Asset Administration Shell (AAS) and openAPI 3.0. It can be used to transform AAS to openAPI 3.0 objects and vice versa. Moreover, it can be used to generate a CRUD server that allows access to the AAS data via a RESTful API with openAPI Specifications.

Installation

You can install the package by running the following command in the terminal:

pip install aas2openapi

Alternatively, you can install the package with poetry for development:

poetry shell
poetry install

Please note that the package is only compatible with Python 3.10 or higher.

Getting started

In the following, we will consider a minimal example to demonstrate the usage of the package. The example is also available in the examples and consists of defining a simple model of an AAS with two submodels, transforming this model, and integrating it with the aas2openapi middleware.

Defining a simple AAS

At first, we create a simple data model with the basic building blocks (AAS and Submodel) of aas2openapi:

class BillOfMaterialInfo(models.SubmodelElementCollection):
    manufacterer: str
    product_type: str

class BillOfMaterial(models.Submodel):
    components: typing.List[str]
    bill_of_material_info: BillOfMaterialInfo

class ProcessModel(models.Submodel):
    processes: typing.List[str]

class Product(models.AAS):
    bill_of_material: BillOfMaterial
    process_model: typing.Optional[ProcessModel]

The data model consists of a product that has a process model and a bill of material. The process model and the bill of material are submodels that contain a list of processes and components, respectively.
To be able to instantiate an AAS, we also create an instance of this data model:

example_product = Product(
    id="Product1",
    process_model=ProcessModel(
        id="PMP1",
        processes=["join", "screw"],
        semantic_id="PMP1_semantic_id",
    ),
    bill_of_material=BillOfMaterial(
        id="BOMP1",
        components=["stator", "rotor", "coil", "bearing"],
        semantic_id="BOMP1_semantic_id",
        bill_of_material_info=BillOfMaterialInfo(
            id_short="BOMInfoP1",
            semantic_id="BOMInfoP1_semantic_id",
            manufacterer="Bosch",
            product_type="A542",
        )
    ),
)

Transforming the AAS to openAPI

Now, we can use aas2openapi to transform the object to an AAS and serialize it with basyx to JSON:

import aas2openapi

obj_store = aas2openapi.convert_pydantic_model_to_aas(example_product)

import basyx.aas.adapter.json.json_serialization

with open("examples/simple_aas_and_submodels.json", "w", encoding="utf-8") as json_file:
    basyx.aas.adapter.json.write_aas_json_file(json_file, obj_store)

Of course, we can also transform AAS to python objects, which can be easily transformed to openAPI objects with the middleware:

data_model = aas2openapi.convert_object_store_to_pydantic_models(obj_store)
print(data_model)

Using the aas2openapi middleware

The aas2openapi middleware is built with FastAPI and therefore automatically generates an openAPI Specification for provided models. To use the middleware, we can simply input our predefined data models and connect it with running basyx servers. Here, we use instances of the data models that we have defined before and create a CRUD RESTful API for them:

from aas2openapi.middleware import Middleware

middleware = Middleware()
middleware.load_pydantic_model_instances([example_product])
middleware.generate_rest_api()
middleware.generate_graphql_api()
middleware.generate_model_registry_api()

app = middleware.app

We can run this app either from the command line with uvicorn:

uvicorn app:app --reload

Here, it is assumed that the file you saved is called app.py. If it has a different name, rename the first app.
Alternatively, you can also run the app directly with python, by extending the script at the bottom with:if__name__=="__main__":importuvicornuvicorn.run(app)You can access now the documentation of the REST API with swagger athttp://localhost:8000/docsand the graphql endpoint athttp://localhost:8000/graphql.Besides loading the data models from instances, we can also generate it directly from its python type, from a JSON-object or from an AAS. To do so, simply replace theload_pydantic_model_instancesmethod withload_pydantic_modelsto load from types orload_aas_objectstoreto load from an AAS object store orload_json_modelsto load from serialized JSON-objects:middleware.load_pydantic_model_types([Product])middleware.load_aas_from_objectstore(obj_store)middleware.load_json_models(file_path="examples/example_json_model.json")However, no examples can be provided when loading from types.We can either run the middleware now directly with python or make a docker build. In both scenarios, an AAS and Submodel-server need to be running that the middleware can connect to. Each request to the middleware is then translated and forwarded to the AAS and Submodel-server.The repository already comes with a docker-compose file that can be used to start the AAS and Submodel-server. To start the docker-compose file, run the following command in the terminal:docker-compose-fdocker-compose-dev.yamlupWe can now run the middleware script with python and access it athttp://localhost:8000/. Documentation of the generated Rest API is athttp://localhost:8000/docsavailable by a swaggerUI and the openAPI Specification is available athttp://localhost:8000/openapi.json. 
You can use the swaggerUI to post some AAS or this exemplary command (with bash console):

curl -X 'POST' \
  'http://127.0.0.1:8000/Product/' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "id_short": "Product1",
  "description": "string",
  "id": "Product1",
  "bill_of_material": {
    "id_short": "BOMP1",
    "description": "",
    "id": "BOMP1",
    "semantic_id": "BOMP1_semantic_id",
    "components": ["stator", "rotor", "coil", "bearing"],
    "bill_of_material_info": {
      "id_short": "BOMInfoP1",
      "description": "",
      "semantic_id": "BOMInfoP1_semantic_id",
      "manufacterer": "Bosch",
      "product_type": "A542"
    }
  },
  "process_model": {
    "id_short": "PMP1",
    "description": "",
    "id": "PMP1",
    "semantic_id": "PMP1_semantic_id",
    "processes": ["join", "screw"]
  }
}'

If you want to change the addresses and ports of the AAS and Submodel-server, you can do so by adding a .env file to the root directory of the package. The file should contain specifications similar to those in the .env.example file of the package.

Lastly, we can build a docker image of the middleware and run it in a docker-compose as a container. To do so, just adjust the provided Dockerfile and docker-compose.yaml of this package to fit your needs based on the provided example in the file docker_app.py. To build the docker image, run the following command in the terminal:

docker compose build

Then, we can run the docker-compose file with the following command and start aas2openapi, AAS server and submodel server at the same time:

docker compose up

If you want to start the middleware also with a user interface for the AAS and a mongo db backend for AAS, you can use the following command:

docker compose -f docker-compose-ui.yaml up -d

The user interface is then available at http://localhost:3000/. After posting some AAS with the middleware (see above), you can add the AAS to the user interface by clicking on the "+" button and entering the AAS id (base64 encoded).
The AAS is then available in the user interface.

Contributing

aas2openapi is a new project and therefore has much room for improvement. It would be a pleasure to get feedback or support! If you want to contribute to the package, either create issues on aas2openapi's GitHub page for discussing new features or contact me directly via GitHub or email.

License

The package is licensed under the MIT license.

Acknowledgements

We extend our sincere thanks to the German Federal Ministry for Economic Affairs and Climate Action (BMWK) for supporting this research project 13IK001ZF "Software-Defined Manufacturing for the automotive and supplying industry" (https://www.sdm4fzi.de/).
aas2openapi-client
aas2openapi-client

A client library for accessing aas2openapi.

Usage

First, create a client:

```python
from aas2openapi_client import Client

client = Client(base_url="https://api.example.com")
```

If the endpoints you're going to hit require authentication, use `AuthenticatedClient` instead:

```python
from aas2openapi_client import AuthenticatedClient

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")
```

Now call your endpoint and use your models:

```python
from aas2openapi_client.models import MyDataModel
from aas2openapi_client.api.my_tag import get_my_data_model
from aas2openapi_client.types import Response

my_data: MyDataModel = get_my_data_model.sync(client=client)
# or if you need more info (e.g. status_code)
response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
```

Or do the same thing with an async version:

```python
from aas2openapi_client.models import MyDataModel
from aas2openapi_client.api.my_tag import get_my_data_model
from aas2openapi_client.types import Response

my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
response: Response[MyDataModel] = await get_my_data_model.asyncio_detailed(client=client)
```

By default, when you're calling an HTTPS API it will attempt to verify that SSL is working correctly. Using certificate verification is highly recommended most of the time, but sometimes you may need to authenticate to a server (especially an internal server) using a custom certificate bundle:

```python
client = AuthenticatedClient(
    base_url="https://internal_api.example.com",
    token="SuperSecretToken",
    verify_ssl="/path/to/certificate_bundle.pem",
)
```

You can also disable certificate validation altogether, but beware that **this is a security risk**:

```python
client = AuthenticatedClient(
    base_url="https://internal_api.example.com",
    token="SuperSecretToken",
    verify_ssl=False,
)
```

There are more settings on the generated `Client` class which let you control more runtime behavior; check out the docstring on that class for more info.

Things to know:

1. Every path/method combo becomes a Python module with four functions:
   - `sync`: Blocking request that returns parsed data (if successful) or `None`
   - `sync_detailed`: Blocking request that always returns a `Response`, optionally with `parsed` set if the request was successful
   - `asyncio`: Like `sync` but async instead of blocking
   - `asyncio_detailed`: Like `sync_detailed` but async instead of blocking
2. All path/query params and bodies become method arguments.
3. If your endpoint had any tags on it, the first tag will be used as a module name for the function (`my_tag` above).
4. Any endpoint which did not have a tag will be in `aas2openapi_client.api.default`.

Building / publishing this Client

This project uses Poetry to manage dependencies and packaging. Here are the basics:

1. Update the metadata in `pyproject.toml` (e.g. authors, version)
2. If you're using a private repository, configure it with Poetry:
   - `poetry config repositories.<your-repository-name> <url-to-your-repository>`
   - `poetry config http-basic.<your-repository-name> <username> <password>`
3. Publish the client with `poetry publish --build -r <your-repository-name>`, or, if for public PyPI, just `poetry publish --build`

If you want to install this client into another project without publishing it (e.g. for development) then:

- If that project **is using Poetry**, you can simply do `poetry add <path-to-this-client>` from that project
- If that project is not using Poetry:
  - Build a wheel with `poetry build -f wheel`
  - Install that wheel from the other project: `pip install <path-to-wheel>`
aasaanjobs-notificationhub
Python Aasaanjobs/Waahjobs NotificationHub Client

Python SDK to communicate with Aasaanjobs/Waahjobs Notification Hub and send notifications to users.

Supported Notification Channels

- Short Messaging Service (SMS)
- Email
- WhatsApp
- Mobile Push (FCM)

Installation

```
pip install aasaanjobs-notificationhub
```

Usage

Each notification is referred to as a *Task* in this library. A single *Task* can contain multiple channels, i.e., a single *Task* can contain both *Email* and *WhatsApp* notification data. This *Task* is then validated via Protocol Buffers and pushed to the corresponding Notification Hub Amazon SQS queue.

- For *Transactional* notifications the `NOTIFICATION_HUB_SQS_QUEUE_NAME` environment variable should be configured.
- For *Marketing* notifications the `NOTIFICATION_HUB_MARKETING_SQS_QUEUE_NAME` environment variable should be configured.
- For *OTP* notifications the `NOTIFICATION_HUB_OTP_SQS_QUEUE_NAME` environment variable should be configured.

Configuration

Each application which uses this library must configure Amazon SQS to successfully send notification tasks to the Hub. The following keys can be defined in the settings module of a Django application, or as environment variables:

| Setting | Description |
|---|---|
| NOTIFICATION_HUB_SQS_ACCESS_KEY_ID | Access Key of the IAM role which has access to the Hub SQS |
| NOTIFICATION_HUB_SQS_SECRET_ACCESS_KEY | Secret Access Key of the IAM role which has access to the Hub SQS |
| NOTIFICATION_HUB_SQS_REGION | AWS Region where the Hub SQS resides |
| NOTIFICATION_HUB_SQS_QUEUE_NAME | Name of the Hub SQS Queue |
| NOTIFICATION_HUB_MARKETING_SQS_QUEUE_NAME | Name of the Hub Marketing SQS Queue |
| NOTIFICATION_HUB_OTP_SQS_QUEUE_NAME | Name of the Hub OTP SQS Queue |
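For a non-Django application, the configuration keys above can be supplied as plain environment variables before the library is imported. A minimal sketch (all values are placeholders, not real credentials or queue names):

```python
import os

# Placeholder values -- substitute your own IAM credentials and queue names.
os.environ["NOTIFICATION_HUB_SQS_ACCESS_KEY_ID"] = "AKIA_PLACEHOLDER"
os.environ["NOTIFICATION_HUB_SQS_SECRET_ACCESS_KEY"] = "secret-placeholder"
os.environ["NOTIFICATION_HUB_SQS_REGION"] = "ap-south-1"
os.environ["NOTIFICATION_HUB_SQS_QUEUE_NAME"] = "hub-transactional"
os.environ["NOTIFICATION_HUB_MARKETING_SQS_QUEUE_NAME"] = "hub-marketing"
os.environ["NOTIFICATION_HUB_OTP_SQS_QUEUE_NAME"] = "hub-otp"
```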
aasalert
AASJobAlert

AASJobAlert is a Python script for checking the AAS Job Register. Some code borrowed from this tutorial.

Installation

Use the package manager pip to install aasalert.

```
pip install aasalert
```

Usage

```
usage: aasalert [-h] keyword

Search the AAS job register for post-doc positions in the last year.

positional arguments:
  keyword     Keyword to search the AAS job register for.

optional arguments:
  -h, --help  show this help message and exit
```

This is best used as a cronjob for regular checks of AAS. We can use `mailx` to generate an email alert. For example:

```
crontab -e
00 00 * * 1 aasalert keyword 2>&1 | mailx -E -s "AAS Job Alert" [email protected]
```

See this link for examples of how to use crontab.

Contributing

Pull requests are welcome.

License

MIT
aasare
# aasare An ambitious templater project
aa-sbst
There is no implementation of an SBST (self-balancing search tree) in the standard Python library, and I found that quite inconvenient and a little bit disappointing.

This is a compact, portable (no dependencies) and extremely easy-to-use implementation of a self-balancing binary search tree. This particular type of tree (the so-called AA tree) is described here: https://en.wikipedia.org/wiki/AA_tree

Features:

- You can use this module through the `import` instruction or simply copy-paste the implementation into your source code, and be happy.
- While instantiating an `sbst` object you can specify your own comparison function or use the default simple comparison.
- You can add values to the tree one by one using the function `add`, or fill it from some iterable object (function `addfrom`). Initialization in the constructor is also possible.
- The tree stores all duplicates. This feature is vital if the tree is an index for an in-memory table.
- This SBST gives you two basic search operations:
  - `min` - returns the minimal value that is not less (if the `inclusive` parameter is True) or greater (`inclusive=False`) than a specified limit.
  - `max` - returns the maximal value that is not greater (if the `inclusive` parameter is True) or less (`inclusive=False`) than a specified limit.
  If you have not specified a limit, the functions return the minimal or maximal value in the tree, respectively.
- The function `forward_from` returns a generator that yields a sorted sequence of values starting from a specified value. The function `backward_from` yields a reverse-sorted sequence down from a specified value. These functions have an `inclusive` option too. If the starting value is not specified, these functions yield sorted or reverse-sorted sequences of all values in the tree, respectively. If the tree is modified while iterating (some values inserted, some removed, the tree rebalanced), the sequence will still be yielded in the right, predictable way.
- If the comparison function treats values as equal, they will be yielded by the `forward_from` and `backward_from` generators in insertion order.
- Do not store `None` values in the tree. Even if your comparison function can process them, you will not be able to search for them, because a `None` value is treated as 'not specified'.
- If mutable objects inserted into the tree are changed, their position in the tree may become irrelevant. So after a value mutation it is a good idea to remove it from the tree and add it again.
- The methods `add` and `remove` are not thread-safe. Be careful.

Tutorial: [doc/tutorial.md](doc/tutorial.md)
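The `min`/`max` search semantics described above can be illustrated with the standard `bisect` module on a plain sorted list. This is only a sketch of the semantics, not the tree implementation (a real SBST does the same lookups in O(log n) without keeping a flat list):

```python
import bisect

def sbst_min(sorted_values, limit=None, inclusive=True):
    """Smallest value >= limit (inclusive) or > limit (exclusive); None if absent."""
    if limit is None:
        return sorted_values[0] if sorted_values else None
    if inclusive:
        i = bisect.bisect_left(sorted_values, limit)   # first value >= limit
    else:
        i = bisect.bisect_right(sorted_values, limit)  # first value > limit
    return sorted_values[i] if i < len(sorted_values) else None

def sbst_max(sorted_values, limit=None, inclusive=True):
    """Largest value <= limit (inclusive) or < limit (exclusive); None if absent."""
    if limit is None:
        return sorted_values[-1] if sorted_values else None
    if inclusive:
        i = bisect.bisect_right(sorted_values, limit) - 1  # last value <= limit
    else:
        i = bisect.bisect_left(sorted_values, limit) - 1   # last value < limit
    return sorted_values[i] if i >= 0 else None
```

For example, on the duplicated list `[1, 3, 3, 7, 9]`, `sbst_min(values, 3)` finds 3 itself, while `sbst_min(values, 3, inclusive=False)` skips past both duplicates to 7.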
aas-core3.0
This is a software development kit (SDK) to manipulate, verify and de/serialize Asset Administration Shells based on version 3.0 of the meta-model.

The documentation is available at: https://aas-core30-python.readthedocs.io/en/latest/.

The majority of the code has been automatically generated by aas-core-codegen.

If you want to contribute, see our contributing guide.
aas-core3.0rc02
This is a software development kit (SDK) to manipulate, verify and de/serialize Asset Administration Shells based on version 3.0RC02 of the meta-model.

The documentation is available at: https://aas-core30rc02-python.readthedocs.io/en/latest/.

The majority of the code has been automatically generated by aas-core-codegen.

If you want to contribute, see our contributing guide.
aas-core3.0rc02-testgen
This is a bundle of programs to generate test data based on the meta-model V3.0RC02.Please have a look at the directorytest_data/for the generated data.
aas-core-codegen
Aas-core-codegen: generates code for different programming environments and schemas to handle asset administration shells, based on the meta-model in simplified Python.

Motivation

The meta-model is still at the stage where it changes frequently. However, we need SDKs in different languages (C#, C++, C, Java, Golang, Erlang etc.) as well as different schemas (JSON Schema, XSD, RDF etc.). Keeping up with the changes is hard, time-consuming and error-prone, as each SDK and schema needs to be reviewed independently.

To make the whole development cycle of the meta-model, SDKs and schemas more maintainable, we wrote a code and schema generator. We write a meta-model in a subset of the Python language, parse it and, based on this meta-model, generate the code for different languages and schemas. Therefore we can easily scale to many languages and schemas. Here is a diagram to illustrate the whole process:

Warning about Stability

While we aim for long-term stability of the generators, mind that the current version of the meta-model, version 3 release candidate 2 (V3RC2), is in too much flux to make any solid claims about short-term stability.

For example, not even the set of basic types is settled yet, and there is an ongoing discussion in the UAG Verwaltungsschale about what this set might be. The same holds for the definitions of references and how we should deal with them.

Moreover, the serialization approaches are not finalized either. For example, the current JSON schema does not allow for one-pass serialization (a.k.a. streaming-based serialization).
We are discussing in the UAG Verwaltungsschale whether to use JSON tuples with the model type as prefix instead of JSON objects, but this discussion is still at an early stage. As long as V3RC2 does not stabilize, consider the generated code and schemas to be insufficient for any serious use (either experimental or in production).

Installation

Single-File Release

Please download and unzip the latest release from the GitHub release page.

From PyPI

The tool is also available on PyPI.

Create a virtual environment:

```
python -m venv venv-aas-core-codegen
```

Activate it (in Windows):

```
venv-aas-core-codegen\Scripts\activate
```

or in Linux and OS X:

```
source venv-aas-core-codegen/bin/activate
```

Install the tool in the virtual environment:

```
pip3 install aas-core-codegen
```

Usage

Write your meta-model somewhere, as well as the code snippets for implementation-specific classes and functions. For example, take our test meta-model for inspiration on how to write the meta-model and the snippets.

Make sure you are within the virtual environment where you installed the generator. Alternatively, if you are using the binary release, make sure the release is on your path.

Call the generator with the appropriate target:

```
aas-core-codegen \
  --model_path path/to/meta_model.py \
  --snippets_dir path/to/snippets \
  --output_dir path/to/output \
  --target csharp
```

`--help`:

```
usage: aas-core-codegen [-h] --model_path MODEL_PATH --snippets_dir SNIPPETS_DIR
                        --output_dir OUTPUT_DIR
                        --target {csharp,jsonschema,rdf_shacl,xsd} [--version]

Generate different implementations and schemas based on an AAS meta-model.

optional arguments:
  -h, --help            show this help message and exit
  --model_path MODEL_PATH
                        path to the meta-model
  --snippets_dir SNIPPETS_DIR
                        path to the directory containing implementation-
                        specific code snippets
  --output_dir OUTPUT_DIR
                        path to the generated code
  --target {csharp,jsonschema,rdf_shacl,xsd}
                        target language or schema
  --version             show the current version and exit
```

Versioning

We are still not clear about how to version the generator. For the moment, we use a lax incremental versioning with a `0.0` prefix (`0.0.1`, `0.0.2` etc.).

The changelog is available in CHANGELOG.rst.

Contributing

Feature requests or bug reports are always very, very welcome! Please check quickly whether the issue already exists in the issue section and, if not, create a new issue.

Contributions in code are also welcome! Please see CONTRIBUTING.rst for developing guidelines.
aas-core-meta
We provide here meta-models for the asset administration shell information model, written in simplified Python.

For more information and a more abstract overview, please see our design documents: https://aas-core-works.github.io/design-docs/

Contributors

- Nico Braunisch
- Marko Ristin
- Robert Lehmann
- Marcin Sadurski
- Manuel Sauer
- Sebastian Bader

Contributing

Feature requests or bug reports are always very, very welcome! Please check quickly whether the issue already exists in the issue section and, if not, create a new issue. You can also contribute in code. Please see CONTRIBUTING.rst.

Versioning

Since this package keeps track of multiple versions of the meta-model, we opt to release following a simple release-date schema. For example, for a release on November 20th 2021, we use the version: 2021.11.20.
aascraw
No description available on PyPI.
aasdfg
No description available on PyPI.
aa-secret-santa
AA Secret Santa

A Secret Santa manager for Alliance Auth.

Features

- Accepts applications to be secret santas
- Handles randomly pairing up users
- Notifies users of their santee
- Handles if gifts have been delivered
- Secure Groups integration

Installation

Step 1 - Install app

```
pip install aa-secret-santa
```

Step 2 - Configure Auth settings

Configure your Auth settings (`local.py`) as follows:

- Add `'secretsanta'` to `INSTALLED_APPS`

Step 4 - Maintain Alliance Auth

- Run migrations: `python manage.py migrate`
- Gather your staticfiles: `python manage.py collectstatic`
- Restart your project: `supervisorctl restart myauth:`

Step 5 - Configuration

In the Admin interface, visit secretsanta or `<AUTH-URL>/admin/secretsanta` (Coming Soon)

Permissions

| Perm | Admin Site | Perm | Description |
|---|---|---|---|
| basic_access | nill | Can access Secret Santa | Can access the Secret Santa module and apply to years |
| manager | nill | Can manage Secret Santa | Can manage and see all Santa<>Santee pairs |

Settings

| Name | Description | Default |
|---|---|---|
| SECRETSANTA_GENERATE_PAIRS_PRIORITY | Priority for the generate_pairs task; set high so pairs are generated immediately and issues can be identified | 1 |
| SECRETSANTA_NOTIFY_PRIORITY | Priority for Discord messages for Secret Santa | 5 |

Contributing

Make sure you have signed the License Agreement by logging in at https://developers.eveonline.com before submitting any pull requests. All bug fixes or features must not include extra superfluous formatting changes.
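The settings change in Step 2 is a one-line edit to `local.py`. A minimal sketch, assuming the standard Alliance Auth settings layout where `INSTALLED_APPS` is already defined by the base settings (abbreviated here so the sketch stands alone):

```python
# local.py -- Alliance Auth local settings (sketch)
# In a real install, INSTALLED_APPS comes from the Alliance Auth base settings;
# it is shown here as a plain list only to make the example self-contained.
INSTALLED_APPS = ["allianceauth"]  # ... existing apps, abbreviated

# Register the Secret Santa app so Django picks up its models and views.
INSTALLED_APPS += ["secretsanta"]
```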
aasemble
UNKNOWN
aasemble.deployment
UNKNOWN
aash
aash

Installation

```
pip install aash
```

Usage

```python
from aash import aash

aash()
```
aashikacalc
This is a very simple calculator that takes two numbers as input and performs basic arithmetic operations: addition, subtraction, multiplication and division.

Change Log

0.0.1 (12/12/2022)
- First Release
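A calculator with the behavior described above might look like the following. This is a hypothetical sketch for illustration only; the package does not document its actual API, so the function name and signature here are assumptions:

```python
def calculate(a, b, op):
    """Perform one of the four basic arithmetic operations on two numbers."""
    ops = {
        "+": a + b,
        "-": a - b,
        "*": a * b,
        "/": a / b if b != 0 else None,  # guard against division by zero
    }
    if op not in ops:
        raise ValueError(f"unsupported operation: {op}")
    return ops[op]
```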
aashpdf
This is the homepage of our project
aashupdf
This is the homepage of our project.
aa-simplewiki
SimpleWiki

A simple wiki plugin for Alliance Auth. It supports multiple pages with different sections. Every page and every section can have their own icon. Written in Python with Django.

Contents

- Current Features
- TODO
- Planned
- Screenshots
- Installation
  - Alliance Auth Production (Non-Docker Version, Docker Version)
  - Alliance Auth Development
- Usage
- Permissions
- Support

Current Features

- Create custom wiki menus with dropdowns plus different sections on each menu
- Add an icon next to menus or sections
- Edit pages on the admin panel with markdown
- Support for all major GitHub markdown features
- Support for videos (YouTube and Vimeo)
- Support for alerts and Google Drive
- Search function: search across all wiki menus and sections
- Permission system:
  - Editor permissions can be added to single users or groups via the admin panel
  - Alliance Auth groups are synced to simplewiki: certain pages can only be seen by a specific group
  - Multiple groups support: as long as the user is in any of the required groups, they can access the menu
- Editor interface:
  - Users with editor permission can create, edit and delete custom menus and sections (editor_access)
  - Users with editor permission see edit and delete buttons above all sections (editor_access)
  - Change menu positions with a drag and drop system

TODO

- Quality-of-life updates
- Implement better logging
- Improve code documentation
- Markdown buttons

Planned

Add translations for: German, Spanish, Chinese, Russian, Korean, French, Italian

Active devs: Meowosaurus

Screenshots

- Search
- Admin Panel

Installation

Alliance Auth Production

Non-Docker Version

1.) Install the pip package via `pip install aa-simplewiki`

2.) Add `simplewiki` to your `INSTALLED_APPS` in your project's `local.py`

3.) Restart your server, then run migrations and collectstatic

Docker Version

1.) Please make sure you followed the custom docker-image tutorial here:

2.) Edit your `conf/requirements` and add the following line: `aa-simplewiki` (Check https://pypi.org/project/aa-simplewiki/ for different versions!)

3.) Add `simplewiki` to your `INSTALLED_APPS` in your project's `local.py`

4.)
Start your server: `docker compose --env-file=.env up -d`

5.) Run `docker compose exec allianceauth bash`

7.) Run `auth migrate`

8.) Run `auth collectstatic`

Alliance Auth Development

Make sure you have installed Alliance Auth in the correct way: https://allianceauth.readthedocs.io/en/latest/development/dev_setup/index.html

1.) Download the repo: `git clone https://github.com/meowosaurus/aa-simplewiki`

2.) Make sure it's under the root folder `aa-dev`, not under `myauth`

3.) Change directory into `aa-dev` and run `pip install -e aa-simplewiki`

Important: If you are getting an error saying that simplewiki is not installed after running `pip install -e aa-simplewiki`, delete the `setup.py` file in the aa-simplewiki root directory and try again.

4.) Add `simplewiki` to your `INSTALLED_APPS` in your project's `local.py`

5.) Change directory into `myauth`

6.) Make migrations with `python manage.py makemigrations`

7.) Migrate with `python manage.py migrate`

8.) Restart auth with `python manage.py runserver`

Usage

Check out our wiki on GitHub: https://github.com/meowosaurus/aa-simplewiki/wiki

1.) Go to `{your_auth_url}/admin` -> SimpleWiki -> Add Menu Item

2.) Give it a title, an index (menu items are sorted by their index from low to high) and a name (the name is the name in the url) and hit save.

3.) Go to `{your_auth_url}/admin` -> SimpleWiki -> Add Page Item

4.) Give it a title, a menu page (this is the menu page it will be under), an index (ordered from low to high) and a content description. This description will be the main content and you can use HTML. Hit save.

5.)
Go back to your main auth page, go under Wiki, and you've created your first menu and page.

Permissions

| Perm | Admin Site | Auth Site |
|---|---|---|
| basic_access | None | Can view all wiki pages |
| editor_access | None | Can create, edit and delete wiki pages and menus |

Commands

Migrate all data from 1.0.x to 1.1.1 to use the new model system:

```
python manage.py simplewiki_migrate_v1_1
```

Migrate section data from 1.1.1 and later to 1.1.3 to add author details:

```
python manage.py simplewiki_migrate_v1_3
```

Dependencies

- Alliance Auth
- allianceauth-app-utils
- Mistune
- jQuery
- Nestable

Support

On Discord: meowlicious
aas-job-register-downloader
AAS Job Register Downloader

The AAS Job Register (https://jobregister.aas.org/) has a lot of cool jobs, but these are a bit challenging to sift through.

This tries to collect some data from the site and tabulate it. I hope it helps you in your search!

To use:

```
pip install aas_job_register_downloader
download_aas_jobs
```

If you want the job descriptions to be added in the table:

```
download_aas_jobs -v
```
aasm
Agents Assembly Translator

Table of Contents

- About
- Agents Assembly
- Getting Started
- Structure
- Contributing

About

A target-agnostic translator for Agents Assembly. The translator can be tested live on the Agents Assembly website. It is a part of the Agents Assembly ecosystem. Other applications are:

- Local Interface - GUI for simulation definition, management, and analysis.
- Simulation Run Environment - environment for running scalable agent-based simulations.
- Communication Server - cluster of servers used for XMPP communication.
- Local Development Environment - simple environment for running agent-based simulations.

Agents Assembly

Documentation of Agents Assembly can be read here.

Getting Started

Prerequisites

- Python 3.10

Installation

The translator package can be installed by running:

```
pip install aasm
```

Alternatively, you can download this repository. No additional dependencies are required.

Usage

You can run the translator as a package. To translate `agent.aasm` to SPADE, run:

```
python -m aasm.translate agent.aasm
```

For more usage information, run:

```
python -m aasm.translate --help
```

Structure

- generating
  - code.py - generated code
  - python_code.py - Python code base class
  - python_graph.py - Python graph code generation from the intermediate representation
  - python_spade.py - SPADE agent code generation from the intermediate representation
- intermediate
  - action.py
  - agent.py
  - argument.py - arguments used in instructions
  - behaviour.py
  - block.py - block of code representation
  - declaration.py - declarations used in actions
  - graph.py
  - instruction.py - instructions used in actions
  - message.py
- parsing
  - parse.py - parsing environment from Agents Assembly file
  - op/ - Agents Assembly operations
  - state.py - state definition used for the parsing process
- preprocessor
  - constants.py - constants used in the preprocessor
  - macro.py - macro definitions used in the preprocessor
  - preprocessor_item.py - preprocessor base item
  - preprocessor.py
- utils
  - exception.py
  - validation.py
  - iteration.py
- translate.py - entrypoint

Contributing

Please follow the contributing guide if you wish to contribute to the project.
aasms
UNKNOWN
aa-sov-timer
AA Sovereignty Timer

Sovereignty campaign overview for Alliance Auth.

Contents

- Screenshots
- Installation (Steps 1-6)
- Updating
- Changelog
- Translation Status
- Contributing

Screenshots

- AA Sov Timer Dashboard (Light Mode)
- AA Sov Timer Dashboard (Dark Mode)

Installation

Important: Please make sure you meet all preconditions before you proceed:

- AA Sovereignty Timer is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding. (See the official AA installation guide for details.)
- AA Sovereignty Timer needs the app django-eveuniverse to function. Please make sure it is installed before continuing.

Step 1: Installing the App

Make sure you're in the virtual environment (venv) of your Alliance Auth installation. Then install the latest version:

```
pip install aa-sov-timer
```

Step 2: Update Your AA Settings

Configure your AA settings (`local.py`) as follows:

- Add `'eveuniverse',` to `INSTALLED_APPS`
- Add `'sovtimer',` to `INSTALLED_APPS`
- Restart your supervisor

Step 3: Finalizing the Installation

Copy static files and run migrations:

```
python manage.py collectstatic
python manage.py migrate
```

Step 4: Preload Eve Universe Data

AA Sovereignty Timer uses Eve Universe data to map IDs to names for solar systems, regions and constellations. So you need to preload some data from ESI once. If you have already run this command, you can skip this step.

```
python manage.py eveuniverse_load_data map
python manage.py sovtimer_load_initial_data
```

Both commands might take a moment or two, so be patient ...

Step 5: Setting up Permission

Now you can set up permissions in Alliance Auth for your users.
Add `sovtimer | Sovereignty Timer | Can access the Sovereignty Timer module` to the states and/or groups you would like to have access.

Step 6: Keep Campaigns Updated

Add the following scheduled task to your `local.py`. Once done, restart your supervisor.

```python
# AA Sovereignty Timer - Run sovereignty related updates every 30 seconds
CELERYBEAT_SCHEDULE["sovtimer.tasks.run_sov_campaign_updates"] = {
    "task": "sovtimer.tasks.run_sov_campaign_updates",
    "schedule": 30.0,
}
```

Now your system is updating the sovereignty campaigns every 30 seconds.

Updating

To update your existing installation of AA Sovereignty Timer, first enable your virtual environment. Then run the following commands from your AA project directory (the one that contains `manage.py`).

```
pip install -U aa-sov-timer
python manage.py collectstatic
python manage.py migrate
```

Finally, restart your AA supervisor services.

Changelog

See CHANGELOG.md

Translation Status

Do you want to help translate this app into your language or improve the existing translation? - Join our team of translators!

Contributing

Do you want to contribute to this project? That's cool! Please make sure to read the Contribution Guidelines. (I promise, it's not much, just some basics)
aas-package
aas_zohocrm_repo
aas-r
Faster GET Requests with AIOHTTP

> Used in Ultron-Bot to interact with APIs faster
> Easy to use, fast and asynchronous
> No docs
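Since the package ships no docs, here is a generic aiohttp GET sketch of the kind of concurrent request pattern the description refers to. This is an illustration using plain aiohttp, not this package's actual API (which is undocumented):

```python
import asyncio
import aiohttp

async def fetch(url: str) -> str:
    # One session per batch of requests; aiohttp reuses connections under it.
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            return await resp.text()

async def fetch_many(urls):
    # Issue all GETs concurrently -- this is where the speedup over
    # sequential blocking requests comes from.
    return await asyncio.gather(*(fetch(u) for u in urls))
```

Run a batch with `asyncio.run(fetch_many(["https://example.com", ...]))`.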
aa-srp
AA SRP

SRP module for Alliance Auth.

Contents

- Overview
- Features
- Screenshots (Dashboard, Dashboard (View All), Your SRP Requests, SRP Requests Overview, SRP Request Details)
- Installation (Steps 1-6)
- Permissions
- Changelog
- Translation Status
- Contributing

Overview

Features

- Overview of SRP links
- Overview of your own SRP requests and their status
- Accepting kill mails from zKillboard and EveTools Killboard
- SRP request administration is mostly done via Ajax and without page reloads
- Use modern DataTables with filters wherever they're useful
- Tables fully searchable and sortable
- Mandatory reason on SRP reject
- Notifications in AA with detailed information on SRP rejection
- Discord notification via PM to the user on SRP request approval or rejection, if either AA-Discordbot, Discord Notify or Discord Proxy is installed
- Notify your SRP team (optional) in their Discord channel about new SRP requests, if AA-Discordbot or Discord Proxy is installed

Screenshots

- Dashboard
- Dashboard (View All)
- Your SRP Requests
- SRP Requests Overview
- SRP Request Details

Installation

Important: Please make sure you meet all preconditions before you proceed:

- AA SRP is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding. (See the official Alliance Auth installation guide for details.)
- AA SRP needs Eve Universe to function.
Please make sure it is installed before continuing.

Step 1: Install the Package

Make sure you're in the virtual environment (venv) of your Alliance Auth installation. Then install the latest release directly from PyPI.

```
pip install aa-srp
```

Step 2: Configure Alliance Auth

This is fairly simple, just add the following to the `INSTALLED_APPS` of your `local.py`. Configure your AA settings (`local.py`) as follows:

- Add `"eveuniverse",` to `INSTALLED_APPS`
- Add `"aasrp",` to `INSTALLED_APPS`

Step 3: Finalizing the Installation

Run static files collection and migrations:

```
python manage.py collectstatic
python manage.py migrate
```

Restart your supervisor services for Auth.

Step 4: Preload Eve Universe Data

AA SRP utilizes the EveUniverse module, so it doesn't need to ask ESI for ship information. To set this up, you now need to run the following command.

```
python manage.py aasrp_load_eve
```

Step 5: Setting up Permissions

Now it's time to set up access permissions for your new SRP module. You can do so in your admin backend in the AA SRP section. Read the Permissions section for more information about the available permissions.

Step 6: (Optional) Import From Built-in SRP Module

This step is only needed when you have been using the built-in SRP module until now.

Make sure you don't have any open SRP requests before. All SRP links in the built-in module will be closed during the import process, to make sure not to import any duplicates.

The import process can be done at any given time and doesn't necessarily have to be during the installation.

To import your SRP information from the built-in SRP module, run the following command.

```
python manage.py aasrp_migrate_srp_data
```

Permissions

| ID | Description | Notes |
|---|---|---|
| basic_access | Can access the AA SRP module | Your line members should have this permission. |
| create_srp | Can create new SRP links | Your FCs should have this permission. |
| manage_srp | Can manage SRP | Users with this permission can manage the AA SRP module, like changing and removing SRP links and requests. |
| manage_srp_requests | Can manage SRP requests | Users with this permission can manage the SRP requests, like changing and removing SRP requests. |

Changelog

See CHANGELOG.md

Translation Status

Do you want to help translate this app into your language or improve the existing translation? - Join our team of translators!

Contributing

You want to contribute to this project? That's cool! Please make sure to read the Contribution Guidelines. (I promise, it's not much, just some basics)
aassdd
No description available on PyPI.
aassddee
No description available on PyPI.
aa-standingsrequests
Standings Requests

App for managing character standing requests, made for Alliance Auth.

Contents

- Features
- Screenshots
- Installation
- Settings
- Permissions
- Standings Requirements
- Manual for Standing Managers
- History
- Change Log

Features

- User can request standings for their characters and corporations
- Standing managers can approve / deny standings requests from users
- Automatic verification that approved / revoked standings are added / removed in-game
- When a user leaves the corporation or alliance, the app will automatically suggest to revoke the standing in-game
- Either an alliance or a corporation can be defined as master for in-game standings
- Review current standings of your alliance or corporation and see which auth user requested it (if any)
- Approvals/revocations are automatically logged for later review

Screenshots

Here are some example screenshots:

- Requesting standings for a character
- Reviewing standings requests

Installation

Step 1 - Check prerequisites

- Standings Requests is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding. (See the official AA installation guide for details.)
- Standings Requests needs the app django-eveuniverse to function. Please make sure it is installed before continuing.

Step 2 - Update Eve Online scopes

Add the following scopes to the Eve Online app used by Auth on developers.eveonline.com:

```
esi-alliances.read_contacts.v1
esi-corporations.read_contacts.v1
```

Step 3 - Python installation

Activate your virtual environment and install this app with:

```
pip install aa-standingsrequests
```

Step 4 - Django Installation

Add `'standingsrequests'` to `INSTALLED_APPS` in your Alliance Auth local settings file.
Also add the other settings from the Settings Example and update the example config for your alliance.

The most important part of the settings is `STANDINGS_API_CHARID`, which needs to be the Eve Online ID of the character that will be used to sync standings with your alliance or corporation.

Run database migrations:

```
python manage.py migrate standingsrequests
```

Copy static files to your webserver:

```
python manage.py collectstatic
```

Finally restart Django and Celery.

Step 5 - Setup app within Auth

Open the standingsrequests app in Alliance Auth and add the token for the configured standings character. This will initiate the first pull of standings. You will get a notification once the standings pull is completed (usually within a few minutes).

Last, but not least, make sure to add permissions to groups / states as required to make the new app available to users.

That's it, you should be ready to roll.

Settings Example

Here is a complete example of all settings that go into your local settings file.

```python
# id of character to use for updating alliance contacts
STANDINGS_API_CHARID = 1234
STR_CORP_IDS = [CORP1ID, CORP2ID, ...]
STR_ALLIANCE_IDS = [YOUR_ALLIANCE_ID, ...]

# This is a map, where the key is the State the user is in
# and the value is a list of required scopes to check
SR_REQUIRED_SCOPES = {
    'Member': ['publicData'],
    'Blue': [],
    '': [],  # no state
}

# CELERY tasks
CELERYBEAT_SCHEDULE['standings_requests_standings_update'] = {
    'task': 'standings_requests.standings_update',
    'schedule': crontab(minute='*/30'),
}
CELERYBEAT_SCHEDULE['standings_requests_update_associations_api'] = {
    'task': 'standings_requests.update_associations_api',
    'schedule': crontab(minute='30', hour='*/3'),
}
CELERYBEAT_SCHEDULE['standings_requests_validate_requests'] = {
    'task': 'standings_requests.validate_requests',
    'schedule': crontab(minute='0', hour='*/6'),
}
CELERYBEAT_SCHEDULE['standings_requests_purge_stale_data'] = {
    'task': 'standings_requests.purge_stale_data',
    'schedule': crontab(minute='0', hour='*/24'),
}
```

Settings

Here is a brief explanation of all available
settings:NameDescriptionDefaultSR_CORPORATIONS_ENABLEDswitch to enable/disable ability to request standings for corporationsTrueSR_NOTIFICATIONS_ENABLEDSend notifications to users about the results of standings requests and standing changes of their charactersTrueSR_OPERATION_MODESelect the entity type of your standings master. Can be:"alliance"or"corporation""alliance"SR_REQUIRED_SCOPESmap of required scopes per state (Mandatory, can be [] per state)-SR_PAGE_CACHE_SECONDSNumber of seconds to cache heavy pages like character and groups standing. Set to 0 to disable.600SR_STANDINGS_STALE_HOURSStanding data will be considered stale and removed from the local database after the configured hours. The latest standings data will never be purged, no matter how old it is48SR_STANDING_TIMEOUT_HOURSMax hours to wait for a standing to be effective after being marked actioned. Non effective standing requests will be reset when this timeout expires.24SR_SYNC_BLUE_ALTS_ENABLEDAutomatically sync standing of alts known to Auth that have standing in gameTrueSTANDINGS_API_CHARIDEve Online ID of character to use for fetching alliance contacts from ESI (Mandatory)-STR_ALLIANCE_IDSEve Online ID of alliances. Characters belonging to one of those alliances are considered "in organization". Your main alliance goes here when in alliance mode. (Mandatory, can be [])-STR_CORP_IDSEve Online ID of corporations. Characters belonging to one of those corporations are considered "in organization". Your main corporation goes here when in corporation mode. (Mandatory, can be [])-PermissionsThese are all relevant permissions:NameDescriptionabstract standings request - User can request standingsThis is the permission required to have basic access to this app and be able to request and maintain blue standings without them being revoked. 
IMPORTANT: When a user no longer has this permission, all of their standings will be revoked.
contact set - User can view standings: See which mains the character and corporation standings have been requested by. Typically you'll probably only want standings managers to have this.
abstract standings request - User can process standings requests: User can see standings requests and process/approve/reject them.
contact set - User can export standing requests: User can download all of the standings data, including main character associations, as a CSV file. Useful if you want to do some extra fancy processing in a spreadsheet or something.

Standings Requirements
These are the requirements to be able to request and maintain blue standings. If a character or account falls out of these requirement scopes then their standing(s) will be revoked.
Character: • Character has been added to Auth and is owned by the requesting user. • User has the request_standings permission.
Corporation: • All member characters of the corporation have been added to Auth and are owned by the requesting user. • User has the request_standings permission.
Note that all characters need to maintain valid tokens in Auth or their standing will automatically be revoked.

Manual for Standing Managers
Standing managers have the ability to review standings requests on the "Manage Requests" page.

Standings Requests
Standings Requests are fairly straightforward; there are two options:
Reject: Reject the standings request, effectively deleting it. The user will be able to request it again, however.
Actioned: The requested standing has been actioned/changed in game. The system then expects to see this request become effective within 24 hours. If it does not show up in a standings API pull within 24 hours, the actioned flag is removed and it will show up as a standings request again.
Once a standing is actioned it will be maintained as an "effective" standings request.
If the standing is removed in game while it is still valid in the system then it will become an active request again.

Standings Revocations
Standings will show up here in one of two situations:
The user has deleted the standings request for that contact, indicating they no longer require the standing.
The user is no longer eligible to hold active standings.
Currently it is not indicated which of these two cases (or which automatic revocation case) triggered the standing revocation.
Delete: Make sure you fully understand delete before using it; you will usually use one of the other two options instead of delete. When you delete a standings request it is literally deleted. The system will no longer attempt to manage this request or verify that it has been revoked etc. The standing becomes "unmanaged".
Undo: Turns the standing revocation into a standings request again. Useful if someone got booted from corp or Auth temporarily. If they still don't have the requirements met the next time a validation pass happens then it will be turned into a revocation again.
Actioned: Same as for Standings Requests. The system will hold the revocation in the background until it sees it removed in game. If the standing has still not been unset (or set to neutral or below) in 24 hours then it will appear as a standings revocation again.

Management Commands
This app comes with management commands that provide special features for admins. You can run any management command from the command line. Make sure you are in the folder that also contains manage.py and that you have activated your venv:
python manage.py NAME_OF_COMMAND
standingsrequests_sync_blue_alts: This command automatically creates accepted standing requests for alt characters on Auth that already have blue standing in game.
This can be useful when this app is first installed to avoid having all users manually request standing for alts that are already blue. Standings created by this command will not have an actioner name set.

History
This is a fork of Basraah's standingrequests. Big thanks to Basraah for all his effort in developing the initial version.
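The SR_REQUIRED_SCOPES map from the settings example above boils down to a per-state scope check. The helper below is an illustrative sketch of how such a map is typically evaluated — the function name and logic are assumptions for illustration, not code from standingsrequests itself:

```python
# Illustrative only: how a state -> required-scopes map can be evaluated.
# Neither the map values nor has_required_scopes() are part of the app.
SR_REQUIRED_SCOPES = {
    "Member": ["publicData"],
    "Blue": [],
    "": [],  # users with no state
}


def has_required_scopes(state, token_scopes):
    """Return True if the token covers every scope required for the user's state."""
    required = SR_REQUIRED_SCOPES.get(state, [])
    return all(scope in token_scopes for scope in required)
```

Under this reading, a "Member" needs at least the publicData scope on their token, while "Blue" users (empty list) pass with any token.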
aa-standingssync
Standings Sync
Alliance Auth app for cloning alliance standings and war targets to alts.
Content: Features, Screenshot, How it works, Installation, Updating, Settings, Permissions, Admin Functions, Feedback, Change Log

Features
The main purpose of this app is to enable non-alliance characters to have the same standings view of other pilots in game as their alliance main. This e.g. allows non-alliance scouts to correctly report blues and non-blues. Or JF pilots can see which other non-alliance characters on grid are not blues and therefore a potential threat.
Here is a high-level overview of the main features:
Synchronize alliance contacts to chosen non-alliance characters
Supports coalition usage with multiple alliances in the same Alliance Auth installation
Synchronize alliance war targets as contacts with terrible standing
Automatically deactivates synchronization when a user ceases to be eligible (e.g. main left the alliance)

Screenshot
Here is a screenshot of the main screen.

How it works
To enable non-alliance members to use alliance standings, the personal contacts of that character are replaced with the alliance contacts.

Installation
Step 1 - Check Preconditions
Please make sure you meet all preconditions before proceeding:
Standings Sync is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding. (see the official AA installation guide for details)
Standings Sync needs the app django-eveuniverse to function.
Please make sure it is installed before continuing.

Step 2 - Install app
Install into the AA virtual environment with pip from PyPI:
pip install aa-standingssync

Step 3 - Update Eve Online app
Update the Eve Online app used for authentication in your AA installation to include the following scopes:
esi-characters.read_contacts.v1
esi-characters.write_contacts.v1
esi-alliances.read_contacts.v1

Step 4 - Configure AA settings
Configure your AA settings (local.py) as follows:
Add 'standingssync' to INSTALLED_APPS
Add these lines to the bottom of your settings file:

# settings for standingssync
CELERYBEAT_SCHEDULE['standingssync.run_regular_sync'] = {
    'task': 'standingssync.tasks.run_regular_sync',
    'schedule': crontab(minute=0, hour='*/2')
}

Please also see the settings section for more configuration options. For example, a setting is required to enable syncing war targets.

Step 5 - Finalize installation into AA
Run migrations & copy static files:
python manage.py migrate
python manage.py collectstatic --noinput
Restart your supervisor services for AA.

Step 6 - Setup permissions
Now you can access Alliance Auth and setup permissions for your users. See section "Permissions" below for details.

Step 7 - Setup alliance character
Finally you need to set the alliance character that will be used for fetching the alliance contacts / standing. Just click on "Set Alliance Character" and add the requested token. Note that only users with the appropriate permission will be able to see and use this function. Once an alliance character is set the app will immediately start fetching alliance contacts. Wait a minute and then reload the page to see the result. That's it.
The Standings Sync app is fully installed and ready to be used.

Updating
To update your existing installation of Standings Sync, first enable your virtual environment. Then run the following commands from your AA project directory (the one that contains manage.py).
pip install -U aa-standingssync
python manage.py migrate
python manage.py collectstatic --noinput
Finally restart your AA supervisor services.

Settings
Here is a list of available settings for this app. They can be configured by adding them to your AA settings file (local.py). If they are not set, the defaults are used.
STANDINGSSYNC_ADD_WAR_TARGETS: When enabled, will automatically add or set war targets with standing = -10 to synced characters. Default: False
STANDINGSSYNC_CHAR_MIN_STANDING: Minimum standing a character needs to have in order to get alliance contacts. Any char with a standing smaller than this value will be rejected. Set to 0.0 if you want to allow neutral alts to sync. Default: 0.1
STANDINGSSYNC_REPLACE_CONTACTS: When enabled, will replace contacts of synced characters with alliance contacts. Default: True
STANDINGSSYNC_STORE_ESI_CONTACTS_ENABLED: Whether to store contacts received from ESI to disk. This is for debugging. Default: False
STANDINGSSYNC_SYNC_TIMEOUT: Duration in minutes after which a delayed sync for managers and characters is reported as down. This value should be aligned with the frequency of the sync task. Default: 180
STANDINGSSYNC_WAR_TARGETS_LABEL_NAME: Name of the EVE contact label for war targets. Needs to be created by the user for each synced character. Required to ensure that war targets are deleted once they become invalid. Not case sensitive. Default: WAR TARGETS

Permissions
This app only uses two permissions. One for enabling this app for users and one for enabling users to add alliances for syncing.
Can add synced character: Enabling the app for a user. This permission should be enabled for everyone who is allowed to use the app (e.g.
Member state). Code: add_syncedcharacter
Can add alliance manager: Enables adding alliances for syncing by setting the character for fetching alliance contacts. This should be limited to users with admin / leadership privileges. Code: add_syncmanager

Admin functions
Admins will find a "Standings Sync" section on the admin page. This section provides the following features:
See a list of all set up alliances with their sync status
See a list of all enabled characters with their current sync status
Manually remove characters / alliances from sync
Manually start the sync process for characters / alliances

Feedback
If you encounter any bugs or would like to request a new feature please open an issue in this gitlab repo.
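The STANDINGSSYNC_CHAR_MIN_STANDING setting described above reduces to a simple threshold check on a character's standing. This is an illustrative sketch of that rule, not code from the app — the helper name is an assumption:

```python
# Illustrative only: how the minimum-standing setting gates which
# characters may receive alliance contacts.
STANDINGSSYNC_CHAR_MIN_STANDING = 0.1  # documented default


def may_receive_contacts(standing, min_standing=STANDINGSSYNC_CHAR_MIN_STANDING):
    """A character is eligible for contact sync only if its standing
    meets the configured threshold; anything below is rejected."""
    return standing >= min_standing
```

With the default of 0.1, a neutral alt (standing 0.0) is rejected; setting the value to 0.0, as the settings table notes, allows neutral alts to sync.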
aa-statistics
AA-Statistics
AA-Statistics for Alliance Auth. Periodically gathers and updates statistics for use by other modules.
Currently used by: AA-DiscordBot, AA-SmartGroups
Currently offers: ZKillboard stats

Setup
pip install git+https://github.com/pvyParts/aa-statistics.git
Add 'aastatistics', to INSTALLED_APPS in your local.py
Migrate the database and restart Auth:
python manage.py migrate
python manage.py collectstatic
Add the following lines to your local.py:

## Settings for AA-Statistics
MEMBER_ALLIANCES = [111, 222, 333]  # Alliances you care about statistics for

## Periodic Tasks for AA-Statistics
CELERYBEAT_SCHEDULE['aastatistics.run_stat_model_update'] = {
    'task': 'aastatistics.run_stat_model_update',
    'schedule': crontab(minute=0, hour=0),
}

Issues
Please remember to report any AA-Statistics related issues using the issues on this repository.

Contribute
All contributions are welcome, but if you create a PR for functionality or a bugfix, please do not mix in unrelated formatting changes along with it.
aas-test-engines
Test Engines for the Asset Administration ShellThe Asset Administration Shell (AAS) is a standard for Digital Twins. More information can be foundhere.The tools in this repository offer measures to validate compliance of AAS implementations against the AAS standard.InstallationYou can install the AAS Test Engines via pip:python-mpipinstallaas_test_enginesCheck AAS Type 1 (File)Check AASX:fromaas_test_enginesimportfilefromxml.etreeimportElementTreewithopen('aas.aasx')asf:result=file.check_aasx_file(f)# result.ok() == Trueresult.dump()Check JSON:fromaas_test_enginesimportfile# Check filewithopen('aas.json')asf:result=file.check_json_file(f)# result.ok() == True# Or check data directlyaas={'assetAdministrationShells':[],'submodels':[],'conceptDescriptions':[]}result=file.check_json_data(aas)# result.ok() == Trueresult.dump()Check XML:fromaas_test_enginesimportfilefromxml.etreeimportElementTree# Check filewithopen('aas.xml')asf:result=file.check_xml_file(f)# result.ok() == True# Or check data directlydata=ElementTree.fromstring('<environment xmlns="https://admin-shell.io/aas/3/0" />')result=file.check_xml_data(aas)# result.ok() == Trueresult.dump()Checking older versionsBy default, thefile.check...methods check compliance to version 3.0 of the standard. You may want to check against older versions by passing a string containing the version to these methods.You can query the list of supported versions as follows:fromaas_test_enginesimportfileprint(file.supported_versions())print(file.latest_version())Check AAS Type 2 (HTTP API)Check a running server instancefromaas_test_enginesimportapitests=api.generate_tests()# Check an instanceapi.execute_tests(tests,"http://localhost")# Check another instanceapi.execute_tests(tests,"http://localhost:3000")Checking older versions and specific test suitesBy default, theapi.generate_testsmethod generate test cases for version 3.0 of the standard and all associated test suites. 
You may want to check against older versions by passing a string containing the version to these methods. You can also provide a list of test suites to check against:fromaas_test_enginesimportapitests=api.generate_tests('1.0RC03',['repository'])api.execute_tests(tests,"http://localhost")You can query the list of supported versions and their associated test suites as follows:fromaas_test_enginesimportapiprint(api.supported_versions())print(api.latest_version())For version 1.0RC03 the following test suites are available:API NameTest Suite ReadTest Suite Read and WriteAsset Administration Shell APIaas_readaasSubmodel APIsubmodel_readsubmodelAASX File Server APIaasx_readaasxAsset Administration Shell Registry APIaas_registry_readaas_registrySubmodel Registry APIsubmodel_registry_readsubmodel_registryAsset Administration Shell Repository APIaas_repository_readaas_repositorySubmodel Repository APIsubmodel_repository_readsubmodel_repositoryConcept Description Repository APIconcept_description_repository_readconcept_description_repositoryAsset Administration Shell Basic Discovery APIaas_discovery_readaas_discvoerySerialization APIserialization-Description APIdescription-Command line interfaceYou may want to invoke the test tools using the simplified command line interface:# Check filepython-maas_test_enginesfiletest.aasx# Check serverpython-maas_test_enginesapihttps://localhost--suiteregistry
aastex
aastex
This Python library extends PyLaTeX to support the AASTeX LaTeX package.

Installation
aastex is available on PyPI and can be installed using pip:
pip install aastex
aasthaadvancepackage
This is long Description
aas-timeseries
Experimental package to make interactive visualizations for time series, for AAS PublicationsLicenseThis project is Copyright (c) Thomas Robitaille and licensed under the terms of the BSD 3-Clause license. This package is based upon theAstropy package templatewhich is licensed under the BSD 3-clause licence. See the licenses folder for more information.ContributingWe love contributions! aas-timeseries is open source, built on open source, and we’d love to have you hang out in our community.Imposter syndrome disclaimer: We want your help. No, really.There may be a little voice inside your head that is telling you that you’re not ready to be an open source contributor; that your skills aren’t nearly good enough to contribute. What could you possibly offer a project like this one?We assure you - the little voice in your head is wrong. If you can write code at all, you can contribute code to open source. Contributing to open source projects is a fantastic way to advance one’s coding skills. Writing perfect code isn’t the measure of a good developer (that would disqualify all of us!); it’s trying to create something, making mistakes, and learning from those mistakes. That’s how we all improve, and we are happy to help others learn.Being an open source contributor doesn’t just mean writing code, either. You can help out by writing documentation, tests, or even giving feedback about the project (and yes - that includes giving feedback about the contribution process). Some of these contributions may be the most valuable to the project as a whole, because you’re coming to the project with fresh eyes, so you can see the errors and assumptions that seasoned contributors have glossed over.This disclaimer was originally written byAdrienne Lowefor aPyCon talk, and was adapted by aas-timeseries based on its use in the README file for theMetPy project.
aa-stripe
Stripe integration for Django-based projectsThis project’s target is to make the Stripe API easier to use in Django-based applications. At the moment the library supports:charging usersplanssubscriptionswebhooksSupport for Python 2.7 has been dropped since aa-stripe 0.6.0.InstallationAddaa_stripeto your app’sINSTALLED_APPS, and also setSTRIPE_API_KEYin project settings. After all please migrate the app (./manage.py migrate aa_stripe). AddSTRIPE_WEBHOOK_ENDPOINT_SECRETinto your settings from stripe webhooks configuration to enable webhooks. AddSTRIPE_USER_MODELif it is different than settings.AUTH_USER_MODEL. In example when CC is connected to office not person.STRIPE_USER_MODELdefaults to AUTH_USER_MODEL.Addaa_stripe.api_urlsinto your url conf.UsageCreating a token for userUse stripe.js (https://stripe.com/docs/stripe.js) to get single use token (stripe.js->createToken) and send it to API using/aa-stripe/customersto create Customer for the user. It runs:customer = stripe.Customer.create(source=data["id"]) # data is the response dictionary from Stripe API (in front-end) token = StripeToken.objects.create(user=request.user, content=data, customer_id=customer["id"])This endpoint requires authenticated user. 
In case you need a different implementation (like one call with register) you'll have to adjust your code.

Charging
First of all, make sure to obtain a Stripe user token from the Stripe API, and then save it to aa_stripe.models.StripeToken, for example:

import stripe
from aa_stripe.models import StripeToken

customer = stripe.Customer.create(source=data["id"])  # data is the response dictionary from Stripe API (in front-end)
token = StripeToken.objects.create(user=request.user, content=data, customer_id=customer["id"])

To charge users, create an instance of the aa_stripe.models.StripeCharge model and then call the charge() method:

c = StripeCharge.objects.create(
    user=user,
    token=token,
    amount=500,  # in cents
    description="Charge for stuff",  # sent to Stripe
    comment="Comment for internal information",
    statement_descriptor="My Company",  # sent to Stripe
)
c.charge()

Upon a successful charge it also sends the signal stripe_charge_succeeded with the instance as its single parameter. If the charge fails due to a CardError, charge_attept_failed is set to True and this charge will not be automatically retried by the charge_stripe command. The signal stripe_charge_card_exception will be sent with the instance and the exception. There is also a management command called charge_stripe in case you need to process all the remaining charges or to run it by cron.

Subscriptions support
With the Stripe user token already obtained you can create a subscription.

import stripe
from aa_stripe.models import StripeSubscription

subscription = StripeSubscription.objects.create(
    customer=self.customer,
    user=self.user,
    plan=self.plan,
    metadata={"name": "test subscription"},
)

The newly created object is not sent to Stripe just yet.

subscription_data = subscription.create_at_stripe()

The command above returns the whole subscription data sent by Stripe, including, for example, discounts.
https://stripe.com/docs/api#subscriptions

Utility functions for subscriptions
subscription.refresh_from_stripe() - gets updated subscription data from Stripe.
Example usage: parsing webhooks - when webhook altering subscription is received it is good practice to verify the subscription at Stripe before making any actions.subscription.cancel() - cancels subscription at Stripe.StripeSubscription.get_subcriptions_for_cancel() - returns all subscriptions that should be canceled. Stripe does not support end date for subscription so it is up the user to implement expiration mechanism. Subscription has end_date that can be used for that.StripeSubscription.end_subscriptions() - cancels all subscriptions on Stripe that has passed end date. Use with caution, check internal comments.management command: end_subscription.py. Terminates outdated subscriptions in a safe way. In case of error returns it at the end, using Sentry if available or in console. Should be used in cron script. By default sets at_period_end=True.Subscription PlansSubscription Plans can be created using Stripe UI however there are cases when those needs to be created using API.import stripe from aa_stripe.models import StripeSubscriptionPlan plan = StripeSubscriptionPlan.objects.create( source={"a": "b"}, amount=5000, name="gold-basic", interval=StripeSubscriptionPlan.INTERVAL_MONTH, interval_count=3, )As with Subscription, the object has to be sent to stripe.plan_data = plan.create_at_stripe()The command above returns whole plan data send by stripe.https://stripe.com/docs/api#plansCoupons SupportStripe coupons can be created both in the Stripe Dashboard and using theaa_stripe.models.StripeCouponmodel, and also if webhooks are properly configured in your app, you will be able to see all changes related to coupons made in the Stripe Dashboard. 
This works both ways: if a coupon was created, edited or deleted on the application side, the list of coupons in Stripe will be updated respectively.

from aa_stripe.models import StripeCoupon

coupon = StripeCoupon.objects.create(
    coupon_id="SALE10",
    duration=StripeCoupon.DURATION_FOREVER,
    currency="usd",
    amount_off=10,  # in dollars
)
# coupon was created at Stripe
coupon.delete()
# coupon was deleted from Stripe, but the StripeCoupon object is kept
print(coupon.is_deleted)  # True

Important: When updating coupon data, do not use the StripeCoupon.objects.update() method, because it does not call the StripeCoupon.save() method, and therefore the coupon will not be updated at Stripe.

The refresh_coupons management command
To make sure your app is always up to date with Stripe, the refresh_coupons management command should be run periodically (e.g. from cron). It lets you periodically verify that all coupons are correctly stored in your app and that no new coupons were created or deleted at Stripe. For more information about coupons, see: https://stripe.com/docs/api#coupons

Webhooks support
All webhooks should be sent to the /aa-stripe/webhooks URL. Add STRIPE_WEBHOOK_ENDPOINT_SECRET to your settings to enable webhook verification. Each received webhook is saved as a StripeWebhook object in the database. Users need to implement webhook parsing depending on the project. Be advised: webhooks might not arrive because of some error, or might arrive in incorrect order. When parsing a webhook it is also good to download the referred object to verify its state. Stripe has the weird tendency to stop sending webhooks, and they have not fixed it yet on their side. To make sure all events have arrived in your system, the check_pending_webhooks management command should be run periodically.
In case there are more pending webhooks than specified in the STRIPE_PENDING_WEBHOOKS_THRESHOLD variable in your settings (default: 20), an email will be sent to project admins with the ids of the pending events, and the command will fail raising an exception, so if you have some kind of error tracking service configured on your servers (for example: Sentry), you will be notified. Also, if ENV_PREFIX is specified in your settings file, it will be included in the email to admins to indicate on which server the failure occurred. By default the site used in the check_pending_webhooks command is the first django.contrib.sites.models.Site object from the database, but in case you need to use some other site, please use the --site parameter to pass your site's id.

Parsing webhooks
To parse webhooks, you can connect to the aa_stripe.models.webhook_pre_parse signal, which is sent each time a StripeWebhook object is parsed.
Sample usage:

from aa_stripe.models import StripeWebhook, webhook_pre_parse

def stripewebhook_pre_parse(sender, instance, event_type, event_model, event_action, **kwargs):
    if not instance.is_parsed:
        ...  # parse

webhook_pre_parse.connect(stripewebhook_pre_parse, sender=StripeWebhook)

Arguments:
sender - the StripeWebhook class
instance - the StripeWebhook event object
event_type - Stripe event type (for example: coupon.created, invoice.payment_failed, ping, etc., see: https://stripe.com/docs/api#event_types)
event_model - the model which created the event (for example: coupon, invoice, charge.dispute, etc.)
event_action - the action done on the event_model (for example: created, updated, payment_failed, etc.)
Both event_model and event_action equal None if event_type is a ping event.

Updating customer card data
The StripeCustomer.sources list is updated after receiving a webhook from Stripe about updating the customer object. It is a list of Stripe source objects. Another way of updating the credit card information is to run the refresh_customers management command in cron.

Support
Django 2.2-3.2
Python 3.6-3.9
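The relationship between event_type, event_model and event_action described above can be sketched with a small helper. This is illustrative only — aa-stripe derives these values itself and passes them to the webhook_pre_parse signal; the function below is not part of the library:

```python
def split_event_type(event_type):
    """Derive (event_model, event_action) from a Stripe event type string.

    Illustrative helper, not part of aa-stripe. 'ping' events have
    neither a model nor an action.
    """
    if event_type == "ping":
        return None, None
    # The action is everything after the last dot, so "charge.dispute.created"
    # yields model "charge.dispute" and action "created".
    model, _, action = event_type.rpartition(".")
    return model, action
```

For example, "coupon.created" splits into model "coupon" and action "created", while "invoice.payment_failed" splits into "invoice" and "payment_failed".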
aa-structures
StructuresApp for managing Eve Online structures with Alliance Auth.OverviewThis app is for managing Eve Online structures withAlliance Auth. It allows all member corporations to see a current list of all their structures in Auth. In addition it allows forwarding Eve Online notifications to Discord.FeaturesStructures adds the following main features to Alliance Auth:Structure browser with a detailed list of all structures owned by member corporations, automatically synced with the game serverStructures include all Upwell structures, Custom Offices and Starbases / POSesAutomatically forwards Eve Online notifications to Discord channels as alerts for these categories:Upwell structuresCustoms officesStarbasesMoon miningSovereigntyWarsCorporation membership changesAutomatically adds timers from relevant notifications toAlliance Auth Structure TimersorStructure Timers IIapp (if installed)Fittings, contents of Upwell structures (e.g. fuel blocks and ammo) and current fuel usageAdditional structure notifications generated by the app:Configurable alerts to warn about fuel running at in structures and POSes, and which can replace the ESI notificationsNotifications to inform when structures have been refueled (BETA)Configurable alerts to warn about jump fuel (liquid ozone) running low in jump gatesPOS reinforced (BETA)Tax rates and access settings of Customs OfficesPermissions define which structures are visible to a user based on organization membershipSelf-defined tags help to better organize structuresAbility to increase notification response time and sync resilience with multiple sync characters per structure ownerAutomatically sends alerts to users and admin when token become invalid or sync from ESI failsInterface for 3rd party monitoring of the services statusChinese :cn:, English :us:, German :de: and Russian :ru: localizationPlease also check outDiscord Notifythat will automatically forward auth notifications to the respective users on Discord.ScreenshotsHere is an 
example for the main list of structures:You can also see the fittings and contents any upwell structures:The tax rate and access configuration of customs offices is visible too:And you can see the configuration for starbases:This is an example for a notification posted on Discord:InstallationPlease see theoperation manualfor the installation guide.
aa-structuretimers
Structure Timers II
An app for keeping track of Eve Online structure timers with Alliance Auth and Discord.
Contents: Overview, Screenshots, Installation, Settings, Notification Rules, Permissions, Management commands, Change Log

Overview
Structure Timers II is an enhanced version of the Alliance Auth's Structure Timers app, with many additional useful features and an improved UI. Here is an overview of its main features in comparison to Auth's basic variant (Feature: Auth / Structure Timers II):
Create and edit timers for structures: x / x
See all pending timers at a glance and with live countdowns: x / x
Restrict timer access to your corporation: x / x
Restrict ability to create and delete timers to certain users: x / x
Get automatic notifications about upcoming timers on Discord: - / x
Define a timer type (e.g. armor or hull): - / x
Restrict timer access to your alliance: - / x
Restrict timer access to people with special clearance ("OPSEC"): - / x
Add screenshots to timers (e.g. with the structure's fitting): - / x
Create timers more quickly and precisely with autocomplete for solar system and structure types: - / x
Find timers more quickly with filters and full text search: - / x
Automatic cleanup of elapsed timers: - / x

Hint: If you like to see all timers in a calendar view please consider checking out the amazing app Allianceauth Opcalendar, which is fully integrated with Structure Timers II.

Screenshots
List of timers; Details for a timer; Creating a new timer; Notification on Discord

Installation
Step 1 - Check Preconditions
Please make sure you meet all preconditions before proceeding:
Structure Timers is a plugin for Alliance Auth. If you don't have Alliance Auth running already, please install it first before proceeding. (see the official AA installation guide for details)
Structure Timers needs the app django-eveuniverse to function.
Please make sure it is installed before continuing.
Note that Structure Timers II is compatible with Alliance Auth's Structure Timers app and can be installed in parallel.

Step 2 - Install app
Make sure you are in the virtual environment (venv) of your Alliance Auth installation. Then install the newest release from PyPI:
pip install aa-structuretimers

Step 3 - Configure settings
Configure your Auth settings (local.py) as follows:
Add 'structuretimers' to INSTALLED_APPS
Add the following lines to your settings file:

CELERYBEAT_SCHEDULE['structuretimers_housekeeping'] = {
    'task': 'structuretimers.tasks.housekeeping',
    'schedule': crontab(minute=0, hour=3),
}

Optional: Add additional settings if you want to change any defaults. See Settings for the full list.

Step 4 - Finalize installation
Run migrations & copy static files:
python manage.py migrate
python manage.py collectstatic
Restart your supervisor services for Auth.

Step 5 - Preload Eve Universe data
In order to be able to select solar systems and structure types for timers you need to preload some data from ESI once. If you have already run those commands previously you can skip this step. Load the Eve Online map:
python manage.py eveuniverse_load_data map
python manage.py structuretimers_load_eve
You may want to wait until the data loading is complete before starting to create new timers.

Step 6 - Migrate existing timers
If you have already been using the classic app from Auth, you can migrate your existing timers over to Structure Timers II. Just run the following command:
python manage.py structuretimers_migrate_timers
Note: We suggest migrating timers before setting up notification rules to avoid potential notification spam for migrated timers.

Step 7 - Setup notification rules
If you want to receive notifications about timers on Discord you can setup notification rules on the admin site, e.g. you can setup a rule to send notifications 60 minutes before a timer elapses.
Please see Notification Rules for details.

Step 8 - Setup permissions
Another important step is to setup permissions, to ensure the right people have access to features. Please see Permissions for an overview of all permissions.

Settings
Here is a list of available settings for this app. They can be configured by adding them to your Auth settings file (local.py). Note that all settings are optional and the app will use the documented default settings if they are not used.
STRUCTURETIMERS_MAX_AGE_FOR_NOTIFICATIONS: Will not send notifications for timers whose event time is older than the given minutes. Default: 60
STRUCTURETIMERS_NOTIFICATIONS_ENABLED: Whether notifications for timers are scheduled at all. Default: True
STRUCTURETIMERS_TIMERS_OBSOLETE_AFTER_DAYS: Minimum age in days for a timer to be considered obsolete. Obsolete timers will automatically be deleted. If you want to keep all timers, set to None. Default: 30
STRUCTURETIMERS_DEFAULT_PAGE_LENGTH: Default page size for the timerboard. Must be an integer value from the available options in the app. Default: 10
STRUCTURETIMERS_PAGING_ENABLED: Whether paging is enabled on the timerboard. Default: True
STRUCTURETIMER_NOTIFICATION_SET_AVATAR: Whether structures sets the name and avatar icon of a webhook. When False the webhook will use its own values as set on the platform. Default: True

Notification Rules
In Structure Timers II you can receive automatic notifications on Discord for timers by setting up notification rules. Notification rules allow you to define in detail what event and which kind of timers should trigger notifications. Note that in general all rules are independent from each other and all enabled rules will be executed for every timer one by one.

Example setup
Here is an example for a basic setup of rules:
Example 1: Notify about every newly created timer without ping (e.g. into a scouts channel)
Trigger: New timer created
Scheduled Time: -
Webhook: YOUR-WEBHOOK
Ping Type: (no ping)
Example 2: Notify 45 minutes before any timer elapses with ping (e.g.
into the FC channel)Trigger: Scheduled timer reachedScheduled Time: T - 45 minutesWebhook: YOUR-WEBHOOKPing Type: @hereKey conceptsHere are some key concepts. For all details please see the onscreen help text when creating rules.TriggersNotifications can be triggered by two kinds of events:When a new timers is createdWhen the remaining time of timer has reached a defined threshold (e.g. 10 minutes before timer elapses)WebhooksEach rule has exactly one webhook. You can of course define multiple rules for the same webhook or define rules for different webhooks.Timer clausesAlmost every property of a timer can be used to define rules. For example you can define to get notifications only for timers which hostile objective or only for final timers.Note that setting a timer clause is optional and clauses that are not set, it will always match any.Staging systemYou can define one or multiple staging systems. Then you can see the distance in jumps and LY from your currently selected staging system to any timer (except for WH systems).Staging systems can be added or modified on the admin site under: Structure Timers/Staging Systems.PermissionsHere are all relevant permissions:CodenameDescriptiongeneral - Can access this app and see timersBasic permission required by anyone to access this app. 
Gives access to the list of timers (which timers a user sees can depend on other permissions and the settings of a timer).
- general - Can create new timers and edit own timers: Users with this permission can create new timers and edit or delete their own timers.
- general - Can edit and delete any timer: Users with this permission can edit and delete any timer.
- general - Can create and see opsec timers: Users with this permission can create and view timers that are opsec restricted.

Management commands

The following management commands are available:

- structuretimers_load_eve: Preload all eve objects required for this app to function
- structuretimers_migrate_timers: Migrate pending timers from Auth's Structure Timers app
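Since the optional settings listed above are plain module-level variables in local.py, overriding a default is a one-line addition per setting. A minimal sketch (the override values here are invented for illustration, not recommendations):

```python
# local.py -- example overrides for some of the optional settings.
STRUCTURETIMERS_TIMERS_OBSOLETE_AFTER_DAYS = 90  # keep timers for 90 days instead of 30
STRUCTURETIMERS_DEFAULT_PAGE_LENGTH = 25         # larger default page on the timerboard
STRUCTURETIMERS_PAGING_ENABLED = True            # explicit, though True is already the default
```

As with any settings change, restart your supervisor services for Auth afterwards so the new values take effect.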
aat
AAT

AsyncAlgoTrading

aat is a framework for writing algorithmic trading strategies in Python. It is designed to be modular and extensible, and is the core engine powering AlgoCoin. It comes with support for live trading across (and between) multiple exchanges, fully integrated backtesting support, slippage and transaction cost modeling, and robust reporting and risk mitigation through manual and programmatic algorithm controls.

Like Zipline, the inspiration for this system, aat exposes a single strategy class which is utilized for both live trading and backtesting. The strategy class is simple enough to write and test algorithms quickly, but extensible enough to allow for complex slippage and transaction cost modeling, as well as mid- and post-trade analysis.

Overview

aat is composed of 4 major parts:

- trading engine
- risk management engine
- execution engine
- backtest engine

Trading Engine

The trading engine initializes all exchanges and strategies, then marshals data, trade requests, and trade responses between the strategy, risk, execution, and exchange objects, while keeping track of high-level statistics on the system.

Risk Management Engine

The risk management engine enforces trading limits, making sure that strategies are limited to certain risk profiles. It can modify or remove trade requests prior to execution depending on user preferences and outstanding positions and orders.

Execution Engine

The execution engine is a simple passthrough to the underlying exchanges. It provides a unified interface for creating various types of orders.

Backtest Engine

The backtest engine provides the ability to run the same strategy offline against historical data.

Trading Strategy

The core element of aat is the trading strategy interface. It is the union of the Strategy interface, which provides methods to buy and sell, with the Callback interface, which provides callbacks in response to data.
Users subclass this class in order to implement their strategies.

Callback

```python
class Callback(metaclass=ABCMeta):
    @abstractmethod
    def onTrade(self, data: MarketData):
        '''onTrade'''

    @abstractmethod
    def onOpen(self, data: MarketData):
        '''onOpen'''

    @abstractmethod
    def onFill(self, resp: TradeResponse):
        '''onFill'''

    @abstractmethod
    def onCancel(self, data: MarketData):
        '''onCancel'''

    @abstractmethod
    def onChange(self, data: MarketData):
        '''onChange'''

    @abstractmethod
    def onError(self, data: MarketData):
        '''onError'''
```

Strategy

```python
class Strategy(metaclass=ABCMeta):
    @abstractmethod
    def requestBuy(self, callback: Callback, data: MarketData):
        '''requestBuy'''

    @abstractmethod
    def requestSell(self, callback: Callback, data: MarketData):
        '''requestSell'''
```

Example Strategy

Here is a simple trading strategy that buys once and holds.

```python
from aat.strategy import TradingStrategy
from aat.structs import MarketData, TradeRequest, TradeResponse
from aat.enums import Side, OrderType
from aat.logging import STRAT as slog, ERROR as elog

class BuyAndHoldStrategy(TradingStrategy):
    def __init__(self) -> None:
        super(BuyAndHoldStrategy, self).__init__()
        self.bought = None

    def onFill(self, res: TradeResponse) -> None:
        self.bought = res
        slog.info('d->g: bought %.2f @ %.2f' % (res.volume, res.price))

    def onTrade(self, data: MarketData) -> bool:
        if self.bought is None:
            req = TradeRequest(side=Side.BUY,
                               volume=1,
                               instrument=data.instrument,
                               order_type=OrderType.MARKET,
                               exchange=data.exchange,
                               price=data.price,
                               time=data.time)
            slog.info("requesting buy : %s", req)
            self.requestBuy(req)
            self.bought = 'pending'

    def onError(self, e) -> None:
        elog.critical(e)

    def onChange(self, data: MarketData) -> None:
        pass

    def onCancel(self, data: MarketData) -> None:
        pass

    def onOpen(self, data: MarketData) -> None:
        pass
```

Trading strategies have a number of required methods for handling messages:

- onTrade: Called when a trade occurs
- onChange: Called when an order is modified
- onFill: Called when a strategy's trade executes
- onCancel: Called when an order is cancelled
- onError: Called when an error occurs
- onOpen: Called when a new order occurs

There are other optional callbacks for more granular processing:

- onStart: Called when the program starts
- onHalt: Called when trading is halted
- onContinue: Called when trading continues
- onExit: Called when the program shuts down

There are also several optional callbacks for backtesting:

- slippage
- transactionCost
- onAnalyze: called after the trading engine has processed all data; used to visualize algorithm performance

Setting up and running

An instance of the TradingStrategy class is able to run live or against a set of historical trade/quote data. When instantiating a TradingEngine object with a TradingEngineConfig object, the TradingEngineConfig has a type which can be set to:

- live - live trading against the exchange
- simulation - live trading against the exchange, but with order entry disabled
- sandbox - live trading against the exchange's sandbox instance
- backtest - offline trading against historical OHLCV data

To test our strategy in any mode, we will need to set up exchange keys to get historical data, stream market data, and make new orders.

API Keys

You should create API keys for the exchanges you wish to trade on. For this example, we will assume a Coinbase Pro account with trading enabled. I usually put my keys in a set of shell scripts that are gitignored, so I don't post anything by accident. My scripts look something like:

```
export COINBASE_API_KEY=...
export COINBASE_API_SECRET=...
export COINBASE_API_PASS=...
```

Prior to running, I source the keys I need.

Sandboxes

Currently only the Gemini sandbox is supported; the other exchanges have discontinued theirs. To run in sandbox, set TradingEngineConfig.type to Sandbox.

Live Trading

When you want to run live, set TradingEngineConfig.type to Live. You will want to become familiar with the risk and execution engines, as these control things like max drawdown, max risk accrual, execution eagerness, etc.

Simulation Trading

When you want to run an algorithm live, but don't yet trust that it can make money, set TradingEngineConfig.type to Simulation. This will let it run against real money, but disallow order entry.
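Before launching in any of these modes, it can help to sanity-check that the keys exported in the shell scripts above are actually visible to the Python process. A small sketch; the helper name load_coinbase_creds is ours (not part of aat), but the variable names match the export lines in the API Keys section:

```python
import os

def load_coinbase_creds(env=None):
    """Collect the Coinbase Pro credentials exported by the shell scripts.

    Returns a dict of the three variables, raising early with a clear
    message if any are missing or empty.
    """
    env = os.environ if env is None else env
    names = ("COINBASE_API_KEY", "COINBASE_API_SECRET", "COINBASE_API_PASS")
    creds = {n: env.get(n, "") for n in names}
    missing = [n for n, v in creds.items() if not v]
    if missing:
        raise RuntimeError("missing environment variables: %s" % ", ".join(missing))
    return creds
```

Failing fast here is friendlier than a cryptic authentication error deep inside an exchange connection attempt.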
You can then set things like slippage and transaction costs as you would in a backtest.

Testing

Let's make sure everything works by running a sample strategy (one that doesn't make any trades!) on the Coinbase Pro exchange:

```
python3 -m algocoin --simulation --exchanges=coinbase
```

You should see output like the following:

```
2019-06-01 17:54:17,468 CRITICAL -- MainProcess parser.py:151 --
2019-06-01 17:54:17,469 CRITICAL -- MainProcess parser.py:152 -- Simulation trading
2019-06-01 17:54:17,469 CRITICAL -- MainProcess parser.py:153 --
2019-06-01 17:54:34,570 CRITICAL -- MainProcess trading.py:194 --
2019-06-01 17:54:34,570 CRITICAL -- MainProcess trading.py:195 -- Server listening on port: 8081
2019-06-01 17:54:34,571 CRITICAL -- MainProcess trading.py:196 --
2019-06-01 17:54:34,998 CRITICAL -- MainProcess market_data.py:68 -- Starting algo trading: ExchangeType.COINBASE
```

Config

Because there are a variety of options, a config file is generally more usable. Here is an example configuration for backtesting the buy-and-hold strategy above on Coinbase Pro:

```
> cat backtest.cfg
[general]
verbose=1
print=0
TradingType=backtest

[exchange]
exchanges=coinbase
currency_pairs=BTC/USD

[strategy]
strategies=aat.strategies.buy_and_hold.BuyAndHoldStrategy

[risk]
max_drawdown=100.0
max_risk=100.0
total_funds=10.0
```

Analyzing an algorithm

We can run the above config by running:

```
python3 -m algocoin --config=./backtest.cfg
```

We should see output like the following (actual prices and volumes elided as …):

```
2019-06-01 17:58:40,173 INFO -- MainProcess utils.py:247 -- running in verbose mode!
2019-06-01 17:58:40,174 CRITICAL -- MainProcess parser.py:165 --
2019-06-01 17:58:40,174 CRITICAL -- MainProcess parser.py:166 -- Backtesting
2019-06-01 17:58:40,174 CRITICAL -- MainProcess parser.py:167 --
2019-06-01 17:58:40,176 CRITICAL -- MainProcess trading.py:106 -- Registering strategy: <class 'aat.strategies.buy_and_hold.BuyAndHoldStrategy'>
2019-06-01 17:58:40,177 INFO -- MainProcess backtest.py:25 -- Starting....
2019-06-01 17:58:41,338 INFO -- MainProcess buy_and_hold.py:28 -- requesting buy : <BTC/USD-Side.BUY:…@…>
2019-06-01 17:58:41,339 INFO -- MainProcess risk.py:… -- …
2019-06-01 17:58:41,339 INFO -- MainProcess risk.py:80 -- Risk check passed for partial order: <BTC/USD-Side.BUY:…@…>
2019-06-01 17:58:41,339 INFO -- MainProcess trading.py:244 -- Risk check passed
2019-06-01 17:58:41,339 INFO -- MainProcess trading.py:292 -- Slippage BT - <BTC/USD-Side.BUY:…@…>
2019-06-01 17:58:41,340 INFO -- MainProcess trading.py:295 -- TXN cost BT - <BTC/USD-Side.BUY:…@…>
2019-06-01 17:58:41,340 INFO -- MainProcess buy_and_hold.py:14 -- d->g: bought …@…
2019-06-01 17:58:41,340 INFO -- MainProcess backtest.py:42 -- <BTC/USD:…@…>
...
2019-06-01 17:58:41,474 INFO -- MainProcess backtest.py:42 -- <BTC/USD:…@…>
2019-06-01 17:58:41,474 INFO -- MainProcess backtest.py:33 -- Backtest done, running analysis.
```

This will call our onAnalyze function, which in this case is implemented to plot some performance characteristics with matplotlib.

```python
import pandas
import numpy as np
import matplotlib, matplotlib.pyplot as plt
import seaborn as sns
matplotlib.rc('font', **{'size': 5})

# extract data from trading engine
portfolio_value = engine.portfolio_value()
requests = engine.query().query_tradereqs()
responses = engine.query().query_traderesps()
trades = pandas.DataFrame([{'time': x.time, 'price': x.price}
                           for x in engine.query().query_trades(instrument=requests[0].instrument, page=None)])
trades.set_index(['time'], inplace=True)

# format into pandas
pd = pandas.DataFrame(portfolio_value, columns=['time', 'value', 'pnl'])
pd.set_index(['time'], inplace=True)

# setup charting
sns.set_style('darkgrid')
fig = plt.figure()
ax1 = fig.add_subplot(311)
ax2 = fig.add_subplot(312)
ax3 = fig.add_subplot(313)

# plot algo performance
pd.plot(ax=ax1, y=['value'], legend=False, fontsize=5, rot=0)

# plot up/down chart
pd['pos'] = pd['pnl']
pd['neg'] = pd['pnl']
pd['pos'][pd['pos'] <= 0] = np.nan
pd['neg'][pd['neg'] > 0] = np.nan
pd.plot(ax=ax2, y=['pos', 'neg'], kind='area', stacked=False,
        color=['green', 'red'], legend=False, linewidth=0, fontsize=5, rot=0)

# annotate with key data
ax1.set_title('Performance')
ax1.set_ylabel('Portfolio value ($)')
for xy in [portfolio_value[0][:2]] + [portfolio_value[-1][:2]]:
    ax1.annotate('$%s' % xy[1], xy=xy, textcoords='data')
    ax3.annotate('$%s' % xy[1], xy=xy, textcoords='data')

# plot trade intent/trade action
ax3.set_ylabel('Intent/Action')
ax3.set_xlabel('Date')
ax3.plot(trades)
ax3.plot([x.time for x in requests if x.side == Side.BUY],
         [x.price for x in requests if x.side == Side.BUY], '2', color='y')
ax3.plot([x.time for x in requests if x.side == Side.SELL],
         [x.price for x in requests if x.side == Side.SELL], '1', color='y')
ax3.plot([x.time for x in responses if x.side == Side.BUY],   # FIXME
         [x.price for x in responses if x.side == Side.BUY], '^', color='g')
ax3.plot([x.time for x in responses if x.side == Side.SELL],  # FIXME
         [x.price for x in responses if x.side == Side.SELL], 'v', color='r')

# set same limits
y_bot, y_top = ax1.get_ylim()
x_bot, x_top = ax1.get_xlim()
ax3.set_ylim(y_bot, y_top)
ax1.set_xlim(x_bot, x_top)
ax2.set_xlim(x_bot, x_top)
ax3.set_xlim(x_bot, x_top)
dif = (x_top - x_bot) * .01
ax1.set_xlim(x_bot - dif, x_top + dif)
ax2.set_xlim(x_bot - dif, x_top + dif)
ax3.set_xlim(x_bot - dif, x_top + dif)
plt.show()
```

We can see that our algorithm also implemented slippage and transactionCost, resulting in a worse execution price:

```python
def slippage(self, resp: TradeResponse) -> TradeResponse:
    slippage = resp.price * .0001  # .01% price impact
    if resp.side == Side.BUY:
        # price moves against (up)
        resp.slippage = slippage
        resp.price += slippage
    else:
        # price moves against (down)
        resp.slippage = -slippage
        resp.price -= slippage
    return resp

def transactionCost(self, resp: TradeResponse) -> TradeResponse:
    txncost = resp.price * resp.volume * .0025  # gdax is 0.0025 max fee
    if resp.side == Side.BUY:
        # price moves against (up)
        resp.transaction_cost = txncost
        resp.price += txncost
    else:
        # price moves against (down)
        resp.transaction_cost = -txncost
        resp.price -= txncost
    return resp
```

Extending

Apart from writing new strategies, this library can be extended by adding new exchanges. These are pretty simple. For cryptocurrency exchanges, I rely heavily on ccxt, asyncio, and websocket libraries.

Example

Here is the Coinbase exchange.
Most of the code is to manage different websocket subscription options, and to convert between aat, ccxt, and exchange-specific formatting of things like symbols, order types, etc.

```python
class CoinbaseExchange(Exchange):
    @lru_cache(None)
    def subscription(self):
        return [json.dumps({"type": "subscribe",
                            "product_id": x.value[0].value + '-' + x.value[1].value})
                for x in self.options().currency_pairs]

    @lru_cache(None)
    def heartbeat(self):
        return json.dumps({"type": "heartbeat", "on": True})

    def tickToData(self, jsn: dict) -> MarketData:
        '''convert a jsn tick off the websocket to a MarketData struct'''
        if jsn.get('type') == 'received':
            return
        s = jsn.get('type').upper()
        reason = jsn.get('reason', '').upper()

        if s == 'MATCH' or (s == 'DONE' and reason == 'FILLED'):
            typ = TickType.TRADE
        elif s in ('OPEN', 'DONE', 'CHANGE', 'HEARTBEAT'):
            if reason == 'CANCELED':
                typ = TickType.CANCEL
            elif s == 'DONE':
                typ = TickType.FILL
            else:
                typ = TickType_from_string(s.upper())
        else:
            typ = TickType.ERROR

        order_id = jsn.get('order_id', jsn.get('maker_order_id', ''))
        time = parse_date(jsn.get('time')) if jsn.get('time') else datetime.now()

        if typ in (TickType.CANCEL, TickType.OPEN):
            volume = float(jsn.get('remaining_size', 'nan'))
        else:
            volume = float(jsn.get('size', 'nan'))
        price = float(jsn.get('price', 'nan'))

        currency_pair = str_to_currency_pair_type(jsn.get('product_id')) if typ != TickType.ERROR else PairType.NONE
        instrument = Instrument(underlying=currency_pair)
        order_type = str_to_order_type(jsn.get('order_type', ''))
        side = str_to_side(jsn.get('side', ''))
        remaining_volume = float(jsn.get('remaining_size', 0.0))
        sequence = int(jsn.get('sequence', -1))

        ret = MarketData(order_id=order_id,
                        time=time,
                        volume=volume,
                        price=price,
                        type=typ,
                        instrument=instrument,
                        remaining=remaining_volume,
                        side=side,
                        exchange=self.exchange(),
                        order_type=order_type,
                        sequence=sequence)
        return ret
```
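The subtle part of tickToData is the branching on message type, and that logic can be exercised in isolation. Below is a simplified sketch of the same mapping, with plain strings standing in for aat's TickType enum (the function name classify_tick is ours, not part of aat):

```python
def classify_tick(msg_type: str, reason: str = "") -> str:
    """Mirror the type-branching in CoinbaseExchange.tickToData, simplified.

    Returns a plain string where the real code returns a TickType;
    'IGNORED' stands in for tickToData's early return on 'received'.
    """
    s, reason = msg_type.upper(), reason.upper()
    if s == 'RECEIVED':
        return 'IGNORED'
    if s == 'MATCH' or (s == 'DONE' and reason == 'FILLED'):
        return 'TRADE'
    if s in ('OPEN', 'DONE', 'CHANGE', 'HEARTBEAT'):
        if reason == 'CANCELED':
            return 'CANCEL'
        if s == 'DONE':
            return 'FILL'
        return s  # OPEN / CHANGE / HEARTBEAT map to their own tick types
    return 'ERROR'
```

Note that the same 'done' message can mean a trade, a fill, or a cancel depending on its reason field, which is exactly the kind of exchange quirk these adapter classes exist to hide.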
aat-analysis
AAT Analysis

This package helps with analyzing mobile AAT data.

Install

```
pip install aat_analysis
```

How to use

```python
#%run utils.ipynb # Some utility functions
#%run make_condition_templates.ipynb # Defines expected data based on resources
#%run json_to_df.ipynb # Turns raw json data into dataframes and calculates responses, rts, and force
from aat_analysis.make_condition_templates import make_condition_templates
from aat_analysis.json_to_df import json_to_df
from aat_analysis.utils import merge_data
#from aat_analysis.
```

Define folder paths

- raw should include the raw data from your experiment
- external should include the contents of the Resources folder of your experiment app
- interim and processed can be empty

```python
external_folder = "../data/external/"
interim_folder = "../data/interim/"
raw_data_folder = "../data/raw/"
processed_data_file = "../data/processed/data.csv"
```

Preprocess data

```python
# Creates empty dataframes to define expected data for each condition
templates = make_condition_templates(external_folder)

# Preprocesses data for each participant and moves it to interim
json_to_df(raw_data_folder, external_folder, interim_folder, templates)

# Merges interim data and stores it for further analysis
data = merge_data(interim_folder, drop=['interpolated', 'interpolated_gyro'])
data.to_csv(processed_data_file)
```

```
100%|█████████████████████████████████████████████| 3/3 [00:27<00:00,  9.24s/it]
```

AAT data

The selected columns below contain all data needed to calculate approach tendencies for each session, participant, and stimulus type.
The additional data in the dataframe (not shown) are answers to other questions and some additional AAT variables.

```python
data[['participant', 'condition', 'session', 'trial', 'is_practice', 'stimulus_set',
      'stimulus', 'correct_response', 'response', 'accuracy', 'rt', 'force']]
```

|      | participant | condition   | session                   | trial | is_practice | stimulus_set           | stimulus  | correct_response | response | accuracy | rt     | force     |
|------|-------------|-------------|---------------------------|-------|-------------|------------------------|-----------|------------------|----------|----------|--------|-----------|
| 0    | kmahu0zq    | condition_2 | final_session             | 1     | False       | None                   | None      | None             | NA       | False    | NaN    | NaN       |
| 1    | kmahu0zq    | condition_2 | introduction_session_2    | 1     | True        | practice_food          | stim_0154 | push             | ND       | False    | NaN    | 8.124186  |
| 2    | kmahu0zq    | condition_2 | introduction_session_2    | 2     | True        | practice_objects       | stim_1276 | pull             | pull     | True     | 1206.0 | 12.130466 |
| 3    | kmahu0zq    | condition_2 | introduction_session_2    | 3     | True        | practice_objects       | stim_1264 | pull             | ND       | False    | NaN    | 1.651279  |
| 4    | kmahu0zq    | condition_2 | introduction_session_2    | 4     | True        | practice_objects       | stim_1277 | pull             | pull     | True     | 629.0  | 18.342323 |
| ...  | ...         | ...         | ...                       | ...   | ...         | ...                    | ...       | ...              | ...      | ...      | ...    | ...       |
| 6166 | kmah8va6    | condition_2 | push_food_before_lunch_d5 | 132   | False       | unhealthy_tempting     | stim_0025 | pull             | pull     | True     | 346.0  | 9.024626  |
| 6167 | kmah8va6    | condition_2 | push_food_before_lunch_d5 | 133   | False       | unhealthy_non_tempting | stim_0125 | pull             | pull     | True     | 363.0  | 5.820239  |
| 6168 | kmah8va6    | condition_2 | push_food_before_lunch_d5 | 134   | False       | healthy_non_tempting   | stim_0226 | pull             | pull     | True     | 492.0  | 8.345508  |
| 6169 | kmah8va6    | condition_2 | push_food_before_lunch_d5 | 135   | False       | healthy_tempting       | stim_0201 | pull             | pull     | True     | 450.0  | 5.539470  |
| 6170 | kmah8va6    | condition_2 | push_food_before_lunch_d5 | 136   | False       | objects                | stim_1035 | push             | pull     | False    | 308.0  | 6.589124  |

6171 rows × 12 columns
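One common way to quantify an approach tendency from these columns is a reaction-time difference: if correct pulls are faster than correct pushes for a stimulus set, that suggests an approach bias toward those stimuli. A small pure-Python sketch of that scoring; the function approach_bias and the exact scoring choice (median RT difference, practice trials and errors excluded) are our illustration, not part of aat_analysis:

```python
from statistics import median

def approach_bias(trials):
    """Median correct-push RT minus median correct-pull RT per stimulus_set.

    trials: iterable of dicts using the column names shown above, with
    missing RTs given as None. Positive values mean pulling was faster
    than pushing, i.e. an approach tendency. Practice trials, inaccurate
    responses, and missing RTs are dropped first.
    """
    rts = {}
    for t in trials:
        if t["is_practice"] or not t["accuracy"] or t["rt"] is None:
            continue
        by_resp = rts.setdefault(t["stimulus_set"], {"pull": [], "push": []})
        by_resp[t["correct_response"]].append(t["rt"])
    return {s: median(d["push"]) - median(d["pull"])
            for s, d in rts.items() if d["push"] and d["pull"]}
```

With the merged dataframe above, the same computation could be done per participant and session with a pandas groupby over the same columns.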