package: string (lengths 1 to 122)
package-description: string (lengths 0 to 1.3M)
accesser
Accesser (English version). A tool that fixes sites such as Wikipedia and Pixiv being unreachable because of SNI RST. Supported sites: see the project page. Usage: if you do not know what Python is, download the Windows one-click program from here and just run it (it is recommended to turn off other proxy software); on first use you will be asked to install a certificate, choose Yes. If you already have Python 3.10* or newer installed: pip3 install -U "accesser[doh,doq]". If you do not need DNS-over-HTTPS and DNS-over-QUIC, you can omit [doh,doq]. Then start it with the command: accesser. On Windows, by default (when --notsetproxy is not specified) the PAC proxy is set to http://localhost:7654/pac/?t=<random number>; if it is not set, you can set it manually. Also on Windows, by default (when --notimportca is not specified) the certificate is automatically imported into the system; if it is not, you can import it manually, see here. *You can use, for example, pyenv to install the required Python version (Python 3.11+ recommended). Configuration: after Accesser has been started once, a config.toml file is generated in the working directory; the meaning of each option is explained in its comments. Save your changes and restart the program. Advanced 1: using Accesser together with other proxy software such as v2ray. Accesser is a local HTTP proxy with the default address http://localhost:7654; as long as network traffic can be exported from the other proxy software as an HTTP proxy, the two can be used together. Taking v2ray as an example, you can add an HTTP outbound pointing to http://localhost:7654 and set routing rules that send traffic for Wikipedia, Pixiv, and other sites to this outbound. Start Accesser with the --notsetproxy flag so that Accesser does not set the system proxy. You can also configure a DNS outbound and then edit config.toml so that Accesser uses that DNS. Advanced 2: adding supported websites. Edit the pac file in the working directory (if you use the one-click program, you can download this file from GitHub into the working directory) so that the sites you want go through the proxy. However, not all sites work out of the box; some tuning may be needed, see "how to adapt a site".
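A minimal shell sketch of the install-and-run flow described above, using only the commands and flags mentioned in the description (the proxy-chaining case is the v2ray scenario it outlines):

```bash
# Install with optional DNS-over-HTTPS / DNS-over-QUIC support (Python 3.10+)
pip3 install -U "accesser[doh,doq]"

# Start the local HTTP proxy (PAC/proxy served at http://localhost:7654 by default)
accesser

# When chaining behind another proxy such as v2ray, keep Accesser from setting
# the system PAC proxy and (on Windows) from importing its certificate
accesser --notsetproxy --notimportca
```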
access-face-vision
Access Face Vision. Face detection and recognition application. Installation with pip: pip install access-face-vision. Training / creating a FaceGroup: python -m access_face_vision --mode train --img_dir samples/celebrities --face_group celebrities (directory structure: **/Images/ with one folder per person, e.g. A/A_K_01.jpg, A/A_K_02.jpg, B/B_S_01.jpg, B/B_S_02.jpg). Running inferences: live video feed: python -m access_face_vision --mode live-video --camera-index 0 --camera_wait 30 --face_group celebrities; server mode: python -m access_face_vision --mode server. Use access-client to make requests to this server. Docker image. Docker image build: docker build -t access_face_vision:latest . Docker run: mkdir -p accessai/afv (used as the root directory for the access_face_vision application); start server: docker run -v $(pwd)/afv:/accessai/afv python -m access_face_vision --mode server; start camera feed processor: docker run -v $(pwd)/afv:/accessai/afv python -m access_face_vision --mode server. Contribution: contributions are welcome, feel free to raise PRs with any improvements. Credit: Face Encoder model: https://github.com/nyoki-mtl/keras-facenet
access-github
Access GitHub. Background and motivation. 🔑 Key point: the core motivation is to let millions of developers worldwide enjoy, for free, the speed-up of distributed computation across 20 VMs by driving GitHub Actions. 💣 Pitfall: using GPT-4 directly to develop against the GitHub API hits a wall and produces piles of code that is incompatible with the latest documentation, so this trap must be avoided. In everyday software development we often need to interact with GitHub, for example reading files from different repositories, updating files, and creating or deleting files. Although these operations can be done through the GitHub web interface, in some situations it is more convenient to do them from the command line or from a Python script. To solve this, the project provides a simple, usable Python library and a matching command-line tool that let developers interact with GitHub without leaving the terminal or their programming environment, which greatly improves productivity and makes development smoother. Installation: install access-github with python3 -m pip install access-github. Command-line usage: the CLI is invoked as python3 -m access_github [operation] --token TOKEN --url URL [other_arguments]. Available operations: get_tree: fetch the code tree of a repository; read_file: read the contents of a file at a given path; update_file: update the contents of a file at a given path; create: create a file or folder at a given path; delete: delete a file or folder at a given path; create_or_update_github_action: create or update a GitHub Actions workflow configuration file; dispatch_github_action: trigger a specified GitHub Actions workflow. Usage examples: fetch a repository's code tree: python3 -m access_github get_tree --token TOKEN --url REPO_URL; read a file at a given path: python3 -m access_github read_file --token TOKEN --url REPO_URL --path FILE_PATH; update a file at a given path: python3 -m access_github update_file --token TOKEN --url REPO_URL --path FILE_PATH --content NEW_CONTENT --name YOUR_NAME --email YOUR_EMAIL; create a file: python3 -m access_github create --token TOKEN --url REPO_URL --path FILE_PATH_OR_FOLDER --content FILE_CONTENT --name YOUR_NAME --email YOUR_EMAIL; delete a file: python3 -m access_github delete --token TOKEN --url REPO_URL --path FILE_PATH; create a GitHub Actions workflow: python3 -m access_github create_or_update_github_action --token TOKEN --url REPO_URL --path WORKFLOW_YML_FILE_PATH --content WORKFLOW_YML_CONTENT; trigger a GitHub Actions workflow: python3 -m access_github dispatch_github_action --token TOKEN --url REPO_URL --workflow_yml_filename WORKFLOW_YML_FILENAME --event_type EVENT_TYPE --client_payload CLIENT_PAYLOAD. Python library usage: import the functions you need with from access_github import FUNCTION_NAME, for example from access_github import get_tree, read_file, update_file, create, delete, create_or_update_github_action, dispatch_github_action. Function reference: get_tree(token: str, url: str) -> dict: fetch the code tree of a repository; read_file(token: str, url: str, path: str) -> dict: read the contents of a file at a given path; update_file(token: str, name: str, email: str, url: str, path: str, content: str) -> dict: update the contents of a file at a given path; create(token: str, name: str, email: str, url: str, path: str, content: str) -> dict: create a file at a given path; delete(token: str, url: str, path: str) -> dict: delete a file at a given path; create_or_update_github_action(token: str, url: str, path: str, content: str) -> dict: create or update a GitHub Actions workflow configuration file; dispatch_github_action(token: str, url: str, workflow_yml_filename: str, event_type: str, client_payload: str) -> str: trigger a specified GitHub Actions workflow. Library usage example: from access_github import get_tree, read_file, update_file, create, delete; token = 'your_token'; url = 'https://github.com/user/repo.git'; path = 'path/to/file.txt'; tree = get_tree(token, url)  # fetch the code tree; content = read_file(token, url, path)  # read a file; update_result = update_file(token, 'your_name', 'your_email', url, path, 'new_content')  # update a file; create_result = create(token, 'your_name', 'your_email', url, 'path/to/new/file', 'file_content')  # create a file; delete_result = delete(token, url, 'path/to/file')  # delete a file. Choose whichever functions fit your needs.
accessi
Siemens Access-i Interface libraryLibrary for Siemens Access-i MR Scanner Interface to integrate and control the MR Scanner. Based on Version 1.1.2 for NX (Access-i Developer Guide)Installpip install accessiSiemens DocumentationThe library is based on this document:Access-i Dev Guide NX V1.1.2Usage guideA sample test suite (tests.py) has been created which demonstrates basic Access-i usage.The tests.py requires Access-i simulator to be running on the background, on the same local computer.The tests.py demonstrates most of the implemented methods, as well as receiving images over websocket.CollaboratingThe majority of Access-i functionality is not yet implemented here, if you need more functionality, any additions are accepted.
accessibility
accessibilityis a Python module that wraps the Accessibility API for Mac OS X. It can be used to query and modify attributes of running applications, as well as watch for a variety of notifications. The source code and several examples (in theexamplesdirectory) are hosted onGitHub.The module should compile under recent versions of both Python 2 and 3, and work with Mac OS X 10.8.x and 10.9.x. In addition, to compile the module on version 10.9.0 or later of OS X, you will need to have the Xcode IDE installed.BuildingThe module can be compiled using the traditionalpython setup.py clean build installprovided by setuptools.DocumentationThe module includes extensive docstrings, complete with examples in many cases. These can be can be browsed using Python’shelpcommand, or one can compile the Sphinx documentation. For the latter:cd docsInitialize the git submodule withgit submodule update--init--recursiveto retrieve the custom Sphinx theme.make htmland then browse the documentation indocs/_build/html.LicenseThis project is under the ISC License. See theLICENSE.txtfile for details.
accessibility-toolbar
Accessibility Toolbar (AT4N)This extension adds a toolbar to your notebook with five separate accessibility tools, focusing on support for users who are dyslexic or visually impaired. These features include the ability to make style changes to the notebook, use predefined themes, carry out some common tasks using voice control, spell check inputted text and plan out one’s work using a planner. This toolbar provides users with the tools necessary to use a Jupyter Notebook to its full potential.This project was created by @ednut15 @fabihaahmed @NorahAba @joshuazeltser @taohan16 as part of an MSc project at University College London. The original repo can be found at:Accessibility Toolbar RepositoryInstallYou can install with bower:bowerinstall--config.directory="$(jupyter--data-dir)/nbextensions"accessibility_toolbarOr clone directly from this repository:gitclonehttps://github.com/uclixnjupyternbaccessibility/accessibility_toolbar.git jupyternbextensioninstallaccessibility_toolbarTo enable the extension:jupyternbextensionenableaccessibility_toolbar/mainTo disable the extension:jupyternbextensiondisableaccessibility_toolbar/mainToolbar SummaryNotebook Style ManagerThe aim of this extension is to provide the user with the tools to customise their notebook according to their own specifications.This feature includes support for text size and font changes, line and letter spacing changes and various changes to the pages colours.There is also a predefined styles feature allowing for the saving and loading of saved page styles.Feature summary can be found at:Notebook Style ManagerNotesAll of the styles are saved when refreshing the page into localStorage.To create a predefined style choose your required styles, click on "Add new style", select a name and then click save to save your style.Beware when selecting "Default style" option as this will wipe any style changes not saved as a predefined style.Spell CheckerThe aim of this extension is to provide spell checker functionality for all markdown cells in a notebook.This feature includes the ability to be notify the user of spelling errors inline.It also provides a spell checker menu where words can be pasted and suggestions can be generated for the correct spelling of miss-spelt words.New words can be added to the dictionary there.It is also possible to switch between bold or underlining notifications of spelling mistakes.Feature summary can be found at:Spell CheckerNotesThe Spell Checker icon will turn green when it is enabled.The Spell Checker only works on markdown andNOTon code.Voice ControlThe aim of this extension is to provide voice control support for some of the common actions on a Jupyter notebook.Once enabled the toolbar will listen for any of these key actions and will then execute them.The possible commands are as follows:Run: Run Selected CellRun all: Run all CellsRestart Kernel: Restart the KernelShutdown Kernel: Shutdown the kernelSpell Checker on: Turns on the spell checking feature of the accessibility toolbarSpell Checker off: Turns off the spell checking feature of the accessibility toolbarView Commands: Show the table of available commandsStop Voice control: Turns off the voice control feature of the accessibility toolbarShow Planner: Opens the planner provided by the accessibility toolbarHide Planner: Minimises the planner provided by the accessibility toolbarDark Mode: Activates the dark theme provided by the accessibility toolbarHigh Contrast Mode: Activates the high contrast theme provided by the accessibility 
toolbarDefault Mode: Reverts the notebook to the default themeFeature summary can be found at:Voice ControlNotesThe Voice Control feature is only supported on Chrome at the moment.The Voice Control button will turn green when it is listening for a command.PlannerThe aim of the Planner is to provide a way for a user to plan out their notebook before and during its creation.The Planner is a Mardown text editor that is displayed at the side of the page and is saved together with the notebook.The Planner includes all of the standard Markdown features, with shortcuts to some of them as buttons at the top.Feature summary can be found at:PlannerNotesThe planner automatically saves every minute and can be manually saved using the "Save" buttonThere are various size options for images added to the planner:20% width: imagewidth=planner-2030% width: imagewidth=planner-3040% width: imagewidth=planner-4050% width: imagewidth=planner-5060% width: imagewidth=planner-6070% width: imagewidth=planner-7080% width: imagewidth=planner-8090% width: imagewidth=planner-90100% width: imagewidth=planner-100Accessible ThemesThe aim of the accessible themes is to provide a high contract and dark theme mode for users depending on their preferences.The themes can be easily toggled from the menu provided.Feature summary can be found at:Themes
accessible-graphs
Accessible graphs packageThis package enables you to experience graphs in an accessible manner if you're a blind person who uses a screen reader with or without a braille display.How it works?To get the accessible graph, you need to:Import the package as follows:import accessible_graphs_pkgCall the function "getAccessibleGraph" as follows:accessible_graphs_pkg.getAccessibleGraph(rawData, description, minValue, maxValue)The "getAccessibleGraph function accepts 4 arguments:raw Data - which could be:A list of numerical values representing the graphA dict, where the keys are the labels for the data, and the values are the numbers corresponding to each labeldescription - an optional string describing the graphminValue - an optional argument which tells the system that this is the minimum value. If not specified, the system calculates this value automatically based on the data rangeminValue - an optional argument which tells the system that this is the minimum value. If not specified, the system calculates this value automatically based on the data rangemaxValue - an optional argument which tells the system that this is the maximum value. If not specified, the system calculates this value automatically based on the data rangeExample 1Suppose we want to make the following graph accessible, represented by the following values:1500, 1300, 1700, 2000, 1000, 1450, 1900Suppose also we want to give the description "Demo stock example" to our graph.Then we call our function as follows:accessible_graphs_pkg.getAccessibleGraph([1500, 1300, 1700, 2000, 1000, 1450, 1900], 'Demo stock example')We then are supposed to get a graph similar to the one inthis linkNote that the minimum and maximum values are calculated automatically in this example.Example 2Let's now talk about more realistic example. Suppose we want to get the graph describing the following stock to be accessible, when represented by a list of key-value pairs, where the key is the day in the week, and the value is the value of the stock:Sunday: 1500Monday: 1300Tuesday: 1700Wednesday: 2000Thursday: 1000Friday: 1450Saturday: 1900Suppose also we want to give the description "Demo stock example" to our graph as before.Then we call our function as follows:accessible_graphs_pkg.getAccessibleGraph({'Sunday': 1500, 'Monday': 1300, 'Tuesday': 1700, 'Wednesday': 2000, 'Thursday': 1000, 'Friday': 1450, 'Saturday': 1900}, 'Demo stock example')Then we are supposed to get a graph similar to the one inthis linkNote that the minimum and maximum values are calculated automatically in this example.Example 3Suppose we want to make the graph represented by the following data to be accessible:1, 2, 3, 4, 5Suppose also we want to tell the system that the minimum value should be 2, and the maximum should be 4. Then we call our function as follows:accessible_graphs_pkg.getAccessibleGraph([1, 2, 3, 4, 5], description = 'some description', minValue= 2, maxValue = 4)Some useful linksAccessible Graphs basic guideAccessible Graphs braille tutorial
accessible_output
The accessible_output libraryAuthor:Christopher Toth <[email protected]>Date:$Date: 06-27-2011 02:00:00 -0400 (Mon, Jun 27, 2011)Web site:http://www.qwitter-client.net/Copyright:2011ContentsThe accessible_output libraryIntroductionBasic UsageSpeech OutputsBraille OutputsIntroductionAccessible Output provides a standard way for developers to output text in either speech or braille using a preinstalled screen reader. Using accessible_output makes creating self-voicing applications extremely easy.Basic UsageUsing accessible output is extremely simple:#!/usr/bin/env python from accessible_output import speech s = speech.Speaker() #Will load the default speaker. s.output("The message to speak")Speech OutputsJAWS for WindowsWindow EyesDolphin Screen Readers newer than v11.NVDA 2010.1 or newerSystem Access and System Access To GoMicrosoft sapi 5 speechSpeech DispatcherApple VoiceOverBraille OutputsJAWS for WindowsWindow EyesNVDASystem Access and System Access To Go
accessible-output2
Accessible Output 2 is an MIT licensed library for speaking and brailling through multiple screen readers and other accessibility systems.Accessible Output 2 makes selection of the appropriate speech and Braille output a snap, and also allows the programmer to select and use a specific output, for instance to force speaking through the Microsoft Speech API even if the user has a screen reader loaded.>>>importaccessible_output2.outputs.auto>>>o=accessible_output2.outputs.auto.Auto()>>>o.output("Some text")#attempts to both speak and braille the given text through the first available output>>>o.speak("Some other text",interrupt=True)#Speak some text through the output, without brailling it, and interrupt the currently-speaking text if anyAccessible Output 2 makes it simple to add spoken and brailled notifications to your applications on multiple platforms, facilitating accessibility for the visually impaired and also providing a nice alternative means of providing notifications to a sighted user.Supported Outputs:Speech:JAWS for WindowsNVDAWindow EyesSystem AccessSupernova and other Dolphin productsPC TalkerZDSRMicrosoft Speech APIBraille:JAWS for WindowsNVDAWindow EyesSystem AccessSupernova and other Dolphin products
accessible-output3
Accessible Output 3 is a fork of Accessible Output 2, with fixed mac supportAccessible Output 3 is an MIT licensed library for speaking and brailling through multiple screen readers and other accessibility systems.Accessible Output 3 makes selection of the appropriate speech and Braille output a snap, and also allows the programmer to select and use a specific output, for instance to force speaking through the Microsoft Speech API even if the user has a screen reader loaded.>>>importaccessible_output3.outputs.auto>>>o=accessible_output3.outputs.auto.Auto()>>>o.output("Some text")#attempts to both speak and braille the given text through the first available output>>>o.speak("Some other text",interrupt=True)#Speak some text through the output, without brailling it, and interrupt the currently-speaking text if anyAccessible Output 2 makes it simple to add spoken and brailled notifications to your applications on multiple platforms, facilitating accessibility for the visually impaired and also providing a nice alternative means of providing notifications to a sighted user.Supported Outputs:Speech:JAWS for WindowsNVDAWindow EyesSystem AccessSupernova and other Dolphin productsPC TalkerZDSRMicrosoft Speech APIBraille:JAWS for WindowsNVDAWindow EyesSystem AccessSupernova and other Dolphin products
accessible-pygments
Accessible pygments themesThis package includes a collection of accessible themes for pygments based on different sources.WCAG 2.1 - AAA compliantThe following themes are AAA compliant withWCAG 2.1 criteria for color contrast.a11y-darka11y-high-contrast-darkpitaya-smoothie- Colorblindness friendly.github-light- Colorblindness friendly.github-dark- Colorblindness friendly.github-light-colorblind- Colorblindness friendly.github-dark-colorblind- Colorblindness friendly.github-light-high-contrast- Colorblindness friendly.github-dark-high-contrast- Colorblindness friendly.gotthard-dark- Colorblindness friendly.WCAG 2.1 - AA compliantThe following themes are AA compliant withWCAG 2.1 criteria for color contrast.a11y-lighta11y-high-contrast-lightgotthard-light- Colorblindness friendly.blinds-light- Colorblindness friendly.blinds-dark- Colorblindness friendly.greative- Accessible to most forms of colorblindness and low light settings.For a demo of all our themes pleaseclick here!InstallationOur package is available in both conda and pip via,conda install -c conda-forge accessible-pygmentspip install accessible-pygmentsIf you want to install it directly from source,git clone [email protected]:Quansight-Labs/accessible-pygments.git cd accessible-pygments pip install .UsageImport it using the name identifier for the desired theme,from pygments.formatters import HtmlFormatter HtmlFormatter(style='a11y-light').style <class 'accessible-pygments.A11yLight'>TestsJust open a terminal and run,python test/run_tests.pyYou will see the results undertest/resultsin html format for each supported theme.AcknowledgementsWe want to thank the following sources for being the source of inspiration of one or more themes that are available in this repository,a11y dark and light syntax highlighting.pitaya smoothie vscode theme.github vscode themes.gotthard vscode themes.blinds vscode themes.greative vscode theme.
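To round out the HtmlFormatter snippet above, here is a small end-to-end sketch of highlighting a code string with one of these themes through the standard Pygments API (the theme name a11y-light is taken from the list above; the output filename is arbitrary):

```python
from pygments import highlight
from pygments.lexers import PythonLexer
from pygments.formatters import HtmlFormatter

code = "print('hello, accessible world')"

# 'a11y-light' is one of the AA-compliant themes listed above; any other
# identifier from this package can be substituted.
formatter = HtmlFormatter(style="a11y-light", full=True)

# Write a standalone HTML page with the accessible color scheme embedded.
with open("snippet.html", "w") as fh:
    fh.write(highlight(code, PythonLexer(), formatter))
```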
accessify
accessifyGetting startedWhat is accessifyAccess modifiersMotivationInterfacesMotivationHow to installUsageAccess modifiersPrivateProtectedOther featuresInterfacesSingle interfaceMultiple interfacesException throws declarationDisable checkingContributingReferencesGetting startedWhat is accessifyaccessifyis aPythondesign kit that provides:interfaces,declared exceptions throws,class members accessibility levels.that could be combined with each other to make your code slim and this library usage more justified.Access modifiersAccess level modifiers determine whether other classes can use a particular field or invoke a particular method. Accessibility levels are presented from the box in the languages likeC++,C#andJava.classCar{privatestringStartEngine(){// Code here.}}ButPythondoes not have this in the same way.MotivationWe're all consenting adults herethat is the part of thePython philosophythat relies on human factor instead of the interpreter.There is aPython conventionthat is to use an underscore prefix for protected and private members, that is a bit ugly. Isn't it? For instance, for the following piece of code that provides class a private member.classCar:def__start_engine(self,*args,**kwargs):passMoreover, private and protected methods could be easily accessed outside the class. This is really a point to postpone the correct design of the system to the backlog, increasing the technical debt.classCar:def_start_engine(self,*args,**kwargs):passdef__start_engine(self,*args,**kwargs):passcar=Car()car._start_engine()car._Car__start_engine()InterfacesAn interface is a contract specifying a set of methods and properties which required to be available on any implementing class. If the class implements an interface, but does not realize its method, corresponding errors should be raised. Interfaces are presented from the box in the languages likeC++,C#andJava.interfaceHumanInterface{publicstringEatFood();}classHuman:HumanInterface{publicstringEatFood(){// Code here.}}ButPythondoes not have this in the same way.MotivationThe interface makes checks during the implementation creation, but not actually while execution likeabcmodule inPython.The interface requires that implementation's method arguments match with arguments declared in interfaces,abc— not.A lot of libraries that provide interfaces are no longer supported.A lot of libraries that provide interfaces require you to write a lot of code to use its functionality, this library — not.How to installUsingpipinstall the package from thePyPi.$pip3installaccessifyUsageAccess modifiersPrivatePrivate members are accessible only within the body of the class.In this example, theCarclass contains a private member namedstart_engine. As a private member, they cannot be accessed except by member methods. 
The private memberstart_engineis accessed only by way of a public method calledrun.fromaccessifyimportprivateclassCar:@privatedefstart_engine(self):return'Engine sound.'defrun(self):returnself.start_engine()if__name__=='__main__':car=Car()assert'Engine sound.'==car.run()car.start_engine()The code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/access/private.py",line24,in<module>car.start_engine()File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/main.py",line92,inprivate_wrapperclass_name=instance_class.__name__,method_name=method.__name__, accessify.errors.InaccessibleDueToItsProtectionLevelException:Car.start_engine()isinaccessibleduetoitsprotectionlevelTest it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhASP>private.py $python3private.pyChild classes cannot access parent private members.In this example, theCarclass contains a private member namedstart_engine. As a private member, they cannot be accessed from the child classes,Teslain our case. So overridden methodrunbyTeslaclass cannot use the parent'sstart_enginemember.fromaccessifyimportprivateclassCar:@privatedefstart_engine(self):return'Engine sound.'classTesla(Car):defrun(self):returnself.start_engine()if__name__=='__main__':tesla=Tesla()tesla.run()The code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/inheritance/private.py",line23,in<module>tesla.run()File"examples/inheritance/private.py",line18,inrunreturnself.start_engine()File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/main.py",line94,inprivate_wrapperclass_name=class_contain.__name__,method_name=method.__name__, accessify.errors.InaccessibleDueToItsProtectionLevelException:Car.start_engine()isinaccessibleduetoitsprotectionlevelTest it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhASX>inheritence_private.py $python3inheritence_private.pyProtectedA protected member is accessible within its class and by derived class instances.In this example, theCarclass contains a protected member namedstart_engine. As a protected member, they cannot be accessed except by member methods. The protected memberstart_engineis accessed only by way of a public method calledrun.fromaccessifyimportprotectedclassCar:@protecteddefstart_engine(self):return'Engine sound.'defrun(self):returnself.start_engine()if__name__=='__main__':car=Car()assert'Engine sound.'==car.run()car.start_engine()The code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/access/protected.py",line21,in<module>car.start_engine()File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/main.py",line134,inprotected_wrapperclass_name=instance_class.__name__,method_name=method.__name__, accessify.errors.InaccessibleDueToItsProtectionLevelException:Car.start_engine()isinaccessibleduetoitsprotectionlevelTest it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhASM>protected.py $python3protected.pyChild classes have access to those protected members.In this example, theCarclass contains a protected member namedstart_engine. As a protected member, they can be accessed from the child classes,Teslain our case. 
So overridden methodrunbyTeslaclass can use the parent'sstart_enginemember.fromaccessifyimportprotectedclassCar:@protecteddefstart_engine(self):return'Engine sound.'classTesla(Car):defrun(self):returnself.start_engine()if__name__=='__main__':tesla=Tesla()assert'Engine sound.'==tesla.run()The code will work without errors.Test it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhASD>inheritence_protected.py $python3inheritence_protected.pyOther featuresTheaccessifydecorator removes private and protected members from classdir.fromaccessifyimportaccessify,private@accessifyclassCar:@privatedefstart_engine(self):return'Engine sound.'if__name__=='__main__':car=Car()assert'start_engine'notindir(car)Test it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhASy>dir.py $python3dir.pyInterfacesSingle interfaceWhen you declare that class implements an interface, a class should implementall methodspresented in the interface.In this example, there is an interface calledHumanInterfacethat contains two methodsloveandeat. Also, there is a classHumanthat implements the interface butmissed method «eat», so the corresponding error should be raised.fromaccessifyimportimplementsclassHumanInterface:@staticmethoddefeat(food,*args,allergy=None,**kwargs):passif__name__=='__main__':@implements(HumanInterface)classHuman:passThe code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/interfaces/single.py",line18,in<module>@implements(HumanInterface)File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/interfaces.py",line66,indecoratorinterface_method_arguments=interface_method.arguments_as_string, accessify.errors.InterfaceMemberHasNotBeenImplementedException:classHumandoesnotimplementinterfacememberHumanInterface.eat(food,args,allergy,kwargs)Test it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhh2V>single_method.py $python3single_method.pyWhen you declare that class implements an interface, a class should implement all methods that presented in the interface includingnumber, order and naming of the accepting arguments.In this example, there is an interface calledHumanInterfacethat contains two methodsloveandeat. Also, there is a classHumanthat implements the interface butmissed 3 of 4 arguments for method «eat», so the corresponding error should be raised.fromaccessifyimportimplementsclassHumanInterface:@staticmethoddefeat(food,*args,allergy=None,**kwargs):passif__name__=='__main__':@implements(HumanInterface)classHuman:@staticmethoddefeat(food):passThe code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/interfaces/single_arguments.py",line16,in<module>@implements(HumanInterface)File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/interfaces.py",line87,indecoratorinterface_method_arguments=interface_method.arguments_as_string, accessify.errors.InterfaceMemberHasNotBeenImplementedWithMismatchedArgumentsException:classHumanimplementsinterfacememberHumanInterface.eat(food,args,allergy,kwargs)withmismatchedargumentsTest it out using theexamples. 
Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhh2w>single_arguments.py $python3single_arguments.pyWhen you declare that class implements an interface, a class should implement all methods that presented in the interface including number, order and naming of the accepting arguments andaccess modifier type.In this example, there is an interface calledHumanInterfacethat contains two methodsloveandeat. Also, there is a classHumanthat implements the interface butmissed private access modifier type for method «eat», so the corresponding error should be raised.fromaccessifyimportimplements,privateclassHumanInterface:@private@staticmethoddefeat(food,*args,allergy=None,**kwargs):passif__name__=='__main__':@implements(HumanInterface)classHuman:@staticmethoddefeat(food,*args,allergy=None,**kwargs):passThe code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/interfaces/single_access.py",line18,in<module>@implements(HumanInterface)File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/interfaces.py",line77,indecoratorinterface_method_name=interface_method.name, accessify.errors.ImplementedInterfaceMemberHasIncorrectAccessModifierException:Human.eat(food,args,allergy,kwargs)mismatchesHumanInterface.eat()memberaccessmodifier.Test it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhh2r>single_access.py $python3single_access.pyMultiple interfacesA class could implement multiple interfaces.When you declare that class that implements a bunch of interfaces, a class should implement all method that presented in each interface including number, order and naming of the accepting arguments and access modifier type.In this example, there are an interfaceHumanSoulInterfacethat contains a method calledloveand interfaceHumanBasicsInterfacethat contains a method calledeat. Also, there is a classHumanthat implements methodlovefrom the first interface, butmissed method «eat»from the second one, so the corresponding error should be raised.fromaccessifyimportimplementsclassHumanSoulInterface:deflove(self,who,*args,**kwargs):passclassHumanBasicsInterface:@staticmethoddefeat(food,*args,allergy=None,**kwargs):passif__name__=='__main__':@implements(HumanSoulInterface,HumanBasicsInterface)classHuman:deflove(self,who,*args,**kwargs):passThe code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/interfaces/multiple.py",line19,in<module>@implements(HumanSoulInterface,HumanBasicsInterface)File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/interfaces.py",line66,indecoratorinterface_method_arguments=interface_method.arguments_as_string, accessify.errors.InterfaceMemberHasNotBeenImplementedException:classHumandoesnotimplementinterfacememberHumanBasicsInterface.eat(food,args,allergy,kwargs)Test it out using theexamples. 
Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhh2o>multiple.py $python3multiple.pyException throws declarationWhen you declare that interface method throws a particular exception, a class method that implement interface should contain code in the body that raise this exception.You can declare that the interface method throws multiple exceptions.In this example, exceptionHumanDoesNotExistsErrorand exceptionHumanAlreadyInLoveErrorare declared to be raised by theHumanclass method calledlove, but methodmissed to raise the second exception, so the corresponding error should be raised.fromaccessifyimportimplements,throwsclassHumanDoesNotExistsError(Exception):passclassHumanAlreadyInLoveError(Exception):passclassHumanInterface:@throws(HumanDoesNotExistsError,HumanAlreadyInLoveError)deflove(self,who,*args,**kwargs):passif__name__=='__main__':@implements(HumanInterface)classHuman:deflove(self,who,*args,**kwargs):ifwhoisNone:raiseHumanDoesNotExistsError('Human whom need to love does not exist.')The code above will produce the following traceback.Traceback(mostrecentcalllast):File"examples/interfaces/throws.py",line21,in<module>@implements(HumanInterface)File"/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/accessify/interfaces.py",line103,indecoratorclass_method_arguments=class_member.arguments_as_string, accessify.errors.DeclaredInterfaceExceptionHasNotBeenImplementedException:DeclaredexceptionHumanAlreadyInLoveErrorbyHumanInterface.love()memberhasnotbeenimplementedbyHuman.love(self,who,args,kwargs)Test it out using theexamples. Get the example that contains the code above bycurland run it bypython3.$curl-Lhttps://git.io/fhh26>throws.py $python3throws.pyDisable checkingYou can disable allaccessifychecks. For instance, in the production, when you shouldn't check it because it already was checked in the development. 
Use the following environment variable then:exportDISABLE_ACCESSIFY=TrueContributingClone the project and install requirements:[email protected]:dmytrostriletskyi/accessify.git&&cdaccessify $pip3install-rrequirements-dev.txt $pip3install-rrequirements-tests.txtIf you prefer working with theDockerand wanna easily changePythonenvironments, follow:[email protected]:dmytrostriletskyi/accessify.git&&cdaccessify $exportACCESSIFY_PYTHON_VERSION=3.4 $dockerbuild--build-argACCESSIFY_PYTHON_VERSION=$ACCESSIFY_PYTHON_VERSION-taccessify.-fDockerfile-python3.x $dockerrun-v$PWD:/accessify--nameaccessifyaccessifyEnter the container bash, checkPythonversion and run tests:$dockerexec-itaccessifybash $root@36a8978cf100:/accessify#python--version $root@36a8978cf100:/accessify#pytest-vvtestsClean container and images with the following command:$dockerrm$(dockerps-a-q)-f $dockerrmi$(dockerimages-q)-fWhen you will make changes, ensure your code passthe checkersand is covered by tests usingpytest.If you are new for the contribution, please read:Read about pull requests —https://help.github.com/en/articles/about-pull-requestsRead how to provide pull request —https://help.github.com/en/articles/creating-a-pull-request-from-a-forkAlso the useful article about how to contribute —https://akrabat.com/the-beginners-guide-to-contributing-to-a-github-project/ReferencesCheck it out to familiarize yourself with class members accessibility levels:C# accessibility levels —https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/accessibility-levelsJava accessibility levels —https://docs.oracle.com/javase/tutorial/java/javaOO/accesscontrol.htmlObject-oriented programming interfaces —https://www.cs.utah.edu/~germain/PPS/Topics/interfaces.htmlInterfaces in C# —https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/interfaceInterfaces in Java —https://docs.oracle.com/javase/tutorial/java/concepts/interface.html
accession
accessionis a Python module and command line tool for submitting genomics pipeline analysis output files and metadata to the ENCODE Portal.InstallationNote: installation requires Python >= 3.8$pipinstallaccessionNext, provide your API keys from the ENCODE portal:$exportDCC_API_KEY=XXXXXXXX$exportDCC_SECRET_KEY=yyyyyyyyyyyIt is highly recommended to set the DCC_LAB and DCC_AWARD environment variables for ease of use. These correspond to the lab and award identifiers given by the ENCODE portal, e.g./labs/foo/andU00HG123456, respectively.$exportDCC_LAB=XXXXXXXX$exportDCC_AWARD=yyyyyyyyyyyIf you are accessioning workflows produced using theCaperlocal backend, then installation is complete. However, if using WDL metadata from pipeline runs on Google Cloud, you will also need to authenticate with Google Cloud. Run the following two commands and follow the prompts:$gcloudauthlogin--no-launch-browser$gcloudauthapplication-defaultlogin--no-launch-browserIf you would like to be able to pass Caper workflow IDs or labels you will need to configure access to the Caper server. If you are invokingaccessionfrom a machine where you already have a Caper set up, and you have the Caper configuration file available at~/.caper/default.conf, then there is no extra setup required. If the Caper server is on another machine, you will need so configure HTTP access to it by setting thehostnameandportvalues in the Caper conf file.(Optional) Finally, to enable using Cloud Tasks to upload files from Google Cloud Storage to AWS S3, set the following two environment variables. If one or more of them is not set, then files will be uploaded using the same machine that the accessioning code is run from. For more information on how to set up Cloud Tasks and the upload service, see the docs for thegcs-s3-transfer-service$exportACCESSION_CLOUD_TASKS_QUEUE_NAME=my-queue$exportACCESSION_CLOUD_TASKS_QUEUE_REGION=us-west1To accession workflows produced on AWS backend you will need to set up AWS credentials. The easiest way to do this is to install the AWS CLI and runaws configureUsage$accession-mmetadata.json\-pmirna\-sdevPlease see thedocsfor greater detail on these input parameters.Deploying on Google CloudFirst authenticate with Google Cloud viagcloud auth loginif needed. Then install the API client withpip install google-api-python-client, it is recommended to do this inside of avenv. Finally, create the firewall rule and deploy the instance by runningpython deploy.py –project $PROJECT. This will also install theaccessionpackage. Finally, SSH onto the new instance and rungcloud auth loginto authenticate on the instance.For Caper integration, once the instance is up, SSH onto it and create the Caper conf file at~/.caper/default.conf, use the private IP of the Caper VM instance as thehostnameand use8000for theport. For the connection to work the Caper VM will need to have the tagcaper-server. Also note that the deployment assumes the Cromwell server port is set to8000.AWS NotesTo enable S3 to S3 copy from the pipeline buckets to the ENCODE buckets, ensure that the pipeline bucket policy grants read access to the ENCODE account. 
Here is an example policy:{"Version":"2012-10-17","Statement":[{"Sid":"DelegateS3AccessGet","Effect":"Allow","Principal":{"AWS":["arn:aws:iam::618537831167:root","arn:aws:iam::159877419961:root"]},"Action":"s3:GetObject","Resource":"arn:aws:s3:::PIPELINE-BUCKET/*"},{"Sid":"DelegateS3AccessList","Effect":"Allow","Principal":{"AWS":["arn:aws:iam::618537831167:root","arn:aws:iam::159877419961:root"]},"Action":"s3:ListBucket","Resource":"arn:aws:s3:::PIPELINE-BUCKET"}]}Project Informationaccessionis released under theMITlicense, documentation lives inreadthedocs, code is hosted ongithuband the releases onPyPI.
accession2taxid
accession2taxid: saving NCBI accession2taxid file to database for querying. Installation: pip install accession2taxid. Usage. Saving: accession2taxid -i prot.accession2taxid.FULL. Querying: from accession2taxid import get_scoped_session, Accession2Taxid; with get_scoped_session() as session: session.query(Accession2Taxid).filter_by(accession='A0A0A0MTA4')
accessiontk
accessiontkTools to manage lists of accessions.webmapCreates nice web maps using leaflet.js and vue.js.Usage:Create a csv/tsv with the following columns (colnames flexible):accession,location,datetime,lat,long,location_description A1,L1,2022-01-01,48.4,2.2,"somebody's field"Collect image data into a structure like$treeimages/ images ├──A1 │├──20220324120033.jpg │├──20220324120034.jpg │├──20220324120035.jpg ├──A2│├──20220324120384.jpg │├──20220324120485.jpg │├──20220324120480.jpg │└──20220324120539.jpg └──L1├──20220324120871.jpg├──20220324120880.jpg├──20220324120821.jpg└──20220324120857.jpgRunaccessiontk-webmap. Set the--*-colnamearguments to match your metadata table, give-i,-o, and-t, and then run it. Seeaccessiontk-webmap --helpfor info
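A hypothetical invocation tying the pieces above together; the exact --*-colname flag spellings and the meanings of -i/-o/-t below are assumptions based on the description, so confirm them with accessiontk-webmap --help:

```bash
# Assumed flags: the --*-colname names follow the pattern mentioned above and
# -i/-o/-t are guessed to be input table, output directory and image tree.
accessiontk-webmap \
  --accession-colname accession \
  --lat-colname lat \
  --long-colname long \
  -i accessions.csv \
  -o webmap/ \
  -t images/
```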
access-jamessaxon
Spatial Access: this package provides classical and novel measures of spatial accessibility to services. For full documentation, see access.readthedocs.io (https://access.readthedocs.io/en/latest/).
accesskisters
No description available on PyPI.
accesskit
AccessKitThese are the bindings to use AccessKit from Python.Documentation for the Rust packages can be foundhere.An example program showing how to integrate AccessKit in a pygame application is availablehere.Building from a Source DistributionIf there are no wheels available for your platform, you will have to build one yourself. You will need to have Rust installed on your system, so that the native libraries can be compiled. Please visitrustup.rsfor instructions on how to proceed.Building from within the repositoryThis project usesmaturinas its build tool. If you need to manually build wheels for development purposes, it is recommended to install it inside a virtual environment. All maturin commands must be issued from this repository's root directory.
accesslog
AccessLog Parser and CLIWeb server access log parser and CLI tool with added features for web archive replay logs.InstallationThis package requires Python 3.6 or above. Install the latest version of the released package andaccesslogCLI tool from PyPi:$ pip install accesslogAlternatively, build and install the development version of the package:$ git clone https://github.com/oduwsdl/accesslog-parser.git $ cd accesslog-parser $ python3 setup.py install $ accesslog --versionInput ParsingTODORecord FilteringTODOOutput FormattingTODOCLI Reference$ accesslog -h usage: accesslog [options] [FILES ...] A tool to parse Common Log formatted access logs with various derived fields. positional arguments: files Log files (plain/gz/bz2) to parse (reads from the STDIN, if empty or '-') optional arguments: -h, --help Show this help message and exit -v, --version Show version number and exit -d, --debug Show debug messages on STDERR -e FIELDS, --nonempty FIELDS Skip record if any of the provided fields is empty (comma separated list) -i FIELDS, --valid FIELDS Skip record if any of the provided field values are invalid ('all' or comma separated list from 'host,request,status,size,referrer') -m FIELD~RegExp, --match FIELD~RegExp Skip record if field does not match the RegExp (can be used multiple times) -t TFORMAT, --origtime TFORMAT Original datetime format of logs (default: '%d/%b/%Y:%H:%M:%S %z') -f FORMAT, --format FORMAT Output format string (see available formatting fields below) -j FIELDS, --json FIELDS Output NDJSON with the provided fields (use 'all' for all fields except 'origline') formatting fields: {origline} Original log line {host} IP address of the client {identity} Identity of the client, usually '-' {user} User ID for authentication, usually '-' {origtime} Original date and time (typically in '%d/%b/%Y:%H:%M:%S %z' format) {epoch} Seconds from the Unix epoch (derived from origtime) {date} UTC date in '%Y-%m-%d' format (derived from origtime) {time} UTC time in '%H:%M:%S' format (derived from origtime) {datetime} 14 digit datetime in '%Y%m%d%H%M%S' format (derived from origtime) {request} Original HTTP request line {method} HTTP method (empty for invalid request) {path} Path and query (scheme and host removed, empty for invalid request) {prefix} Memento endpoint path prefix (derived from path) {mtime} 14 digit Memento datetime (derived from path) {rflag} Memento rewrite flag (derived from path) {urir} Memento URI-R (derived from path) {httpv} HTTP version (empty for invalid request) {status} Returned status code {size} Number of bytes returned {referrer} Referer header (empty, if not logged) {agent} User-agent header (empty, if not logged) {extras} Any additional logged fields Default FORMAT: '{host} {date} {time} {method} {path} {status} {size} "{referrer}" "{agent}"'
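A couple of example invocations assembled from the options documented in the help text above:

```bash
# Emit NDJSON with every derived field from a gzipped access log
accesslog -j all access.log.gz

# Keep only records with a valid status whose method is GET, in a custom format
accesslog -i status -m 'method~^GET$' \
  -f '{host} {datetime} {method} {path} {status}' access.log
```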
access-logs-local
Load the content of gzipped Apache HTTP log files Exclude bots, scrapers, etc., select URLs matching the provided regex(es), and generate a CSV of the relevant log entries.Take postprocessed logs and strip out multiple hits in sessions, and resolve URLs to the chosenURI_SCHEME(e.g.info:doi).We strip out entries where the same (IP address * user agent) pair has accessed a URL within the lastSESSION_TIMEOUT(e.g. half-hour)Additionally, we convert the URLs to ISBNs and collate request data by date, outputting a CSV for ingest via the stats system.Release Notes:[0.0.7] - 2024-01-05Changed:Deletion of the spiders filter inprocess_download_logs.py[0.0.6] - 2023-08-13Changed:Refactored driver logicbreaking| Changed parameters for theRequest.__init__()methodRemovedre_match_dictparameterAddedtimestampanduser_agentparametersChanged Request.timestamp from typetimetodatetimeChanged LogStream to use the newRequest.__init__()Expanded range forLogStream.logfile_nameslogic to include files within 1 day of the search_dateLogStream.lines()yieldsRequestobjects, notstrvaluesLogStream.filter_in_line_request()only yields one line per measure[0.0.5] - 2023-07-03Changed:Added start_date and end_date for searching in the log filesAdded the measure_uri to the result[0.0.4] - 2023-07-31Changed:Update file structure and name of the driver[0.0.3] - 2023-07-25Changed:Update requirementsUpdate using a pyproject.toml file as well as the new deployment structure[0.0.2] - 2023-07-11Added:UnittestsChanged:Moved the files out of the package and get the file’s data as parameters and return the filtered data.renamed the plugin to access-logs-local
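The session-stripping rule described above (drop a hit when the same IP address and user-agent pair has already requested that URL within SESSION_TIMEOUT) can be sketched in plain Python roughly as follows; the function and record field names are illustrative, not the package's actual API:

```python
from datetime import datetime, timedelta

# Illustrative sketch only: SESSION_TIMEOUT, the record fields and the function
# name are assumptions, not the access-logs-local API.
SESSION_TIMEOUT = timedelta(minutes=30)

def strip_repeat_hits(records):
    """Yield only the first hit per (ip, user_agent, url) inside each session window."""
    last_counted = {}  # (ip, user_agent, url) -> timestamp of last counted hit
    for rec in sorted(records, key=lambda r: r["timestamp"]):
        key = (rec["ip"], rec["user_agent"], rec["url"])
        previous = last_counted.get(key)
        if previous is None or rec["timestamp"] - previous > SESSION_TIMEOUT:
            last_counted[key] = rec["timestamp"]
            yield rec

hits = [
    {"ip": "1.2.3.4", "user_agent": "UA", "url": "/book/1",
     "timestamp": datetime(2024, 1, 5, 12, 0)},
    {"ip": "1.2.3.4", "user_agent": "UA", "url": "/book/1",
     "timestamp": datetime(2024, 1, 5, 12, 10)},  # within 30 minutes: dropped
]
print(list(strip_repeat_hits(hits)))
```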
access-med-utils
med-utils
access-modifiers
Access modifiers for PythonThis package provides two access modifiers for Python: private methods and protected methods. The goal is to be able to document methods as being private or protected and to provide basic guards against accidentally calling private and protected methods from outside the allowed scopes.How to useExample usage of private methods:fromaccess_modifiersimportprivatemethodclassClass:@privatemethoddefprivate_method(self)->str:return"private method"defpublic_method(self)->str:return"public method calls "+self.private_method()c=Class()print(c.public_method())# Prints "public method calls private method"print(c.private_method())# Raises an exceptionExample usage of protected methods:fromaccess_modifiersimportprotectedmethodclassClass:@protectedmethoddefprotected_method(self)->str:return"protected method"defpublic_method(self)->str:return"public method calls "+self.protected_method()classSubclass(Class):@protectedmethoddefprotected_method(self)->str:return"overridden protected method calls "+super().protected_method()c=Subclass()print(c.public_method())# Prints "public method calls overridden protected method calls protected method"print(c.protected_method())# Raises an exceptionPrivate methods can be combined with static methods. Note that the order matters: staticmethod should be the outermost decorator.fromaccess_modifiersimportprivatemethodclassClass:@staticmethod@privatemethoddefstatic_private_method()->str:return"static private method"defpublic_method(self)->str:return"public method calls "+self.static_private_method()c=Class()print(c.public_method())# Prints "public method calls static private method"print(c.static_private_method())# Raises an exceptionCombining protected methods with static methods is not supported. Combining access modifiers with class methods is not supported (yet).PerformanceThe access modifier decorators work by looking at the code that is calling the decorator to decide whether it is allowed to call the method. To do so, the decorators use implementation details of CPython, like sys._getframe() and the names of code objects such as lambdas and modules. These checks are done on each method call. Consequently, there is a considerable performance impact. Therefore it's recommended to use the access modifiers during testing and turn them off in production using theaccess_modifiers.disable()method. Note that you need to call this method before any of the access modifier decorators are evaluated, i.e.:fromaccess_modifiersimportdisable,privatemethoddisable()# This will disable the access checksclassClass:@privatemethoddefprivate_method(self)->str:return"private_method"disable()# Calling disable here will not work, Class.private_method has already been wrappedInstallationThe package is available from the Python Package Index, install withpip install access-modifiers.DevelopmentTo clone the repository:git clone [email protected]:fniessink/access-modifiers.git.To install the development dependencies:pip install -r requirements-dev.txt.To run the unittests and measure the coverage (which should always be at 100%):ci/unittest.sh.To run Pylint (which should score a 10) and Mypy (which shouldn't complain):ci/quality.sh.The implementation is driven by (unit) tests and has 100% unit test statement and branch coverage. Please look at the tests to see which usage scenario's are currently covered.
accessmysqlconverter
AccessMySQLConverterAccessMySQLConverter aims to provide a tool which converts MS Access database files (.mdb, .accdb) into a SQL file (compatible with PostgreSQL, MariaDB and MySQL) that can be run, generating It's structure (tables, ERM...) and It's dataInstallationTo install it you must have Python 3.x installed and run in the command promptpip install --no-cache-dir --upgrade accessmysqlconverterRun itFor executing the program run in the command prompt the following instructionpython -m accessmysqlconverter.applicationThe tool is limited by the driver so after converting an Access Database you should look for:Nullability of columns and Default valuesCheck ERM for missing 1:1, 1:N or N:NCollation, character setForeign Keys ON DELETE actionsLicenseSeeLICENSEfor more informationDonations
access-niu
[![Build Status](https://travis-ci.org/accessai/access-niu.svg?branch=master)](https://travis-ci.org/accessai/access-niu)[![Python 3.6](https://img.shields.io/badge/python-3.6-blue.svg)](https://www.python.org/downloads/release/python-360/)[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)# access-niuThis repository contains application to train models for Image classification and Regression tasks.## Tasks- [x] Create a basic app for training and inference- [ ] Support for training regression models- [ ] Support of training multi input/output models- [ ] Incorporate Bayesian Inference for finding uncertainty in the predictions.- [ ] Create Docker Image- [ ] Support for serving the application with gunicorn- [ ] Anything else?## Installation```bashpip install access-niu```## Training```bashpython -m access_niu.train --template access_niu/sample/colors/sample_template.yml```## Inference```bashpython -m access_niu.wsgi --project ./colors```Now use this curl command to parse```bashcurl -X POST \http://localhost:8000/parse \-F data=@image_leisure_0.jpg```## References- This project is inspired from [RASA-NLU](https://github.com/RasaHQ/rasa) project.
access-nri-intake
Tools and configuration info used to manage ACCESS-NRI’s intake catalogRead thedocumentation here.DocumentationPackageCI/CDDevelopmentLicense
accessor
UNKNOWN
accessory
AccessoryGeneralized functional getters and setters, via profunctor optics.
access-outlook-email
Access Outlook Email. This project contains the basic files to send and save an email/attachment through the module exchangelib. Installation: note that the module requires Python 3.6 or higher. To install the module, run a pip command like the following: pip install access_outlook_email. Usage, create_account: import the module via from accessOutlookEmail import create_account; account = create_account('[email protected]', '***'). The create_account function requires your Outlook Exchange email address and your password. Usage, send: import via from accessOutlookEmail import send_email. The send_email function requires your account, subject, body, recipients, and optional attachments. To send an email, the following call provides the core functionality: send_email(account, 'TestSubject', 'TestBody', ['[email protected]'], ['path/to/your/attachments']). Usage, save: import via from accessOutlookEmail import save_attachment. The save_attachment function requires your Outlook Exchange folder, which needs to be located in your inbox, as well as a path to save to and an exchangelib account. To save an attachment, the following call provides the core functionality: save_attachment(account, 'TestSubject', 'TestBody', ['[email protected]']). License: this project is licensed under the MIT License. Project status: the current released version is 0.1.5.
accesspanel
No description available on PyPI.
access-parser
AccessDB Parser (Pure Python)Microsoft Access (.mdb / .accdb) database files parser. The parsing logic is fully written in python and works without any external binary dependencies.InstallingUse pip:pip install access-parserOr install manually:gitclonehttps://github.com/ClarotyICS/access_parser.gitcdaccess_parser python3setup.pyinstallDemoUsage Examplefromaccess_parserimportAccessParser# .mdb or .accdb filedb=AccessParser("/path/to/mdb/file.mdb")# Print DB tablesprint(db.catalog)# Tables are stored as defaultdict(list) -- table[column][row_index]table=db.parse_table("table_name")# Pretty print all tablesdb.print_database()Known IssuesThis library was tested on a limited subset of database files. Due to the differences between database versions and the complexity of the parsing we expect to find more parsing edge-cases.To help us resolve issues faster please provide as much data as you can when opening an issue - DB file if possible and full trace including log messages.ThanksThis library was made possible by the great work by mdb-tools. The logic in this library heavily relies on the excellent documentation they havehttps://github.com/brianb/mdbtoolsHuge thanks to Mashav Sapir for the help debugging, CRing and contributing to this projecthttps://github.com/mashavs
access-parser-c
AccessDB Parser (Pure Python)Microsoft Access (.mdb / .accdb) database files parser. The parsing logic is fully written in python and works without any external binary dependencies.InstallingUse pip:pip install git+https://github.com/McSash/access_parser_cOr install manually:gitclonehttps://github.com/McSash/access_parser_c.gitcdaccess_parser_c python3setup.pyinstallDemoUsage Examplefromaccess_parser_cimportAccessParser# .mdb or .accdb filedb=AccessParser("/path/to/mdb/file.mdb")# Print DB tablesprint(db.catalog)# Tables are stored as defaultdict(list) -- table[column][row_index]table=db.parse_table("table_name")# Pretty print all tablesdb.print_database()Another Usage Examplefromaccess_parserimportAccessParserfromazure.storage.blobimportContainerClient# Download access file from azure blob storageBlobContainerClient=ContainerClient.from_connection_string("<StorageConnectionString>",container_name="<ContainerName>")access_object=BlobContainerClient.download_blob("<PathToFile>").readall()# Bytes object of .mdb or .accdb filedb=AccessParser(access_object)# Print DB tablesprint(db.catalog)# Tables are stored as defaultdict(list) -- table[column][row_index]table=db.parse_table("table_name")# Pretty print all tablesdb.print_database()Known IssuesOLE fields are currently not supportedOnly a subset of memo fields are parsedThis library was tested on a limited subset of database files. Due to the differences between database versions and the complexity of the parsing we expect to find more parsing edge-cases.To help us resolve issues faster please provide as much data as you can when opening an issue - DB file if possible and full trace including log messages.ThanksThis fork was made possible by the great work by claroty:https://github.com/claroty/access_parser
access-points
No description available on PyPI.
accessPolygon
Max Christ’s Final Project in Programming for Business IntelligenceGet itpip3 install accessPolygonUse it# Import main from libraryfromaccessPolygon.accessimportmain
access-sftp-server
Access SFTP ServerThis project contains the basic files to upload and download files from an SFTP server.For further information contactValentin Baier.InstallationThe installation can be achieved by usingpip:python -m pip install access_sftp_serverRun via ShellTo run the application via shell you need to execute either:python -m cibc_sftp_server.upload_filesorpython -m cibc_sftp_server.download_filesUsageThe package can be executed from any computer that has the required credentials for the server.LicenseThis project is licensed under the MIT License.Project statusThe current released version is 0.1.0.

access-spotify
access_spotifyThis project is for all the music obsessives out there.It will eventually evolve into a package that offers advanced functionality for utilizing the Spotify API.Currently, the CLI interface queries Spotify for the specified artist (default = 'The Beatles' because who else?) and downloads all the album art, and all the album audio features and analysis for the artist.You can also import individual album data directly into a data-frame via a script or notebook.The album audio features and analysis are pickled as separate data-frames, which can then be read into pandas and analyzed, visualized etc.It is recommended that the user work in their own virtual environment.Get your own Client_ID and Client_Secret fromhttps://developer.spotify.com/dashboard/applicationsFor detailed information on the API and what audio features and audio analysis you can get via this script, please refer tohttps://developer.spotify.com/documentation/web-api/reference/tracks/Installation and UsageInstall the access-spotify packagepip install access-spotifySee the help menuaccess_script.py --helpQuery the Spotify API via CLIaccess_script.py --artist_name 'Led Zeppelin' --client_id 'your-client-id' --client_secret 'your-client-secret'This saves all the album art (high resolution) and album track information (audio features and audio analysis) into pickled data-frames in the data/ folder.Getting data for individual albums from the Spotify APISee the example notebook on GithubAlsoThis is still very new, and I will continue to update the Documentation, functionality and add unit tests. Please let me know if you find any bugs or if you have any specific ideas for extending the functionality.To-DoDocumentationUnit testsImprove the argparse functionalityCool visualizations perhaps?Predictive Modeling? How sad will the next Radiohead album be?This project would not have been possible without Spotipy -->https://spotipy.readthedocs.io/en/2.13.0/-- Go buy them a coffee and give their github repo a star!
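Since the CLI pickles the album data-frames into the data/ folder, they can be loaded back with pandas; the file name below is hypothetical, as the exact naming scheme is not documented here:

import pandas as pd

# Hypothetical file name -- check the data/ folder for the actual pickle names
features = pd.read_pickle("data/led_zeppelin_audio_features.pkl")
print(features.head())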
accessstatsapi
No description available on PyPI.
access-undenied-aws
Access Undenied on AWSAccess Undenied parses AWS AccessDenied CloudTrail events, explains the reasons for them, and offers actionable fixes.Access UndeniedOverviewCommon use casesSimple StartupInstallationInstallation from pipInstallation from source code (development)UsageGetting eventsPermissionsSame account assets only, no SCPsCross-account assets and SCPsCLI CommandsAnalyzeGet SCPsOutput FormatOutput FieldsAccessDeniedReason:ResultDetailsPoliciesToAddExplicitDenyPoliciesAcknowledgementsAppendicesSetting up a venvGetting Cloudtrail events from the AWS Console's event historyExample Cloudtrail eventLeast privilege AccessUndenied policyOverviewAccess Undenied analyzes AWS CloudTrail AccessDenied events, scans the environment to identify and explain the reasons for them, and offers actionable least-privilege remediation suggestions.Common use casesSometimes, thenew and more detailed AccessDenied messages provided by AWSwill be sufficient. However, that is not always the case.Some AccessDenied messages do not provide details. Among the services with (many or exclusively) undetailed messages are: S3, SSO, EFS, EKS, GuardDuty, Batch, SQS, and many more.When the reason for AccessDenied is an explicit deny, it can be difficult to track down and evaluate every relevant policy.Specifically when the reason is an explicit deny in a service control policy (SCP), one has to find and every single policy in the organization that applies to the account.When the problem is a missingAllowstatement, AccessUndenied automatically offers a least-privilege policy based on the CloudTrail event.Simple StartupInstall AccessUndenied:pip install access-undenied-awsAnalyze a CloudTrail event file:access-undenied-aws --file event_history.jsonInstallationInstallation from pippython -m pip install access-undenied-awsInstallation from source code (development)To install from source code, you canset up a venv(optionally), and within that venv.python -m pip install --editable .UsageGetting eventsAccess Undenied works by analyzing a CloudTrail event where access was denied and the error code is either AccessDenied or Client.UnauthorizedOperation, it works on an input of one or more CloudTrail events. You can get them from wherever you get events, they can be found in the event history in the console, or by the LookupEvents API, or through whatever system you use in order to filter and detect events: Athena, Splunk, others. You can either download the records file (the default format for multiple events) or just copy and paste a single event. For an example of how to do this:Getting Cloudtrail events from the AWS Console's event historyPermissionsAccess Undenied runs with the default permissions of the environment running the cli command, and accepts the--profileflag for using a different profile from .aws/credentials.access-undenied-aws --profile my-profile analyze --events-file cloudtrail_events.json(note that the location of the profile flag must be before the sub-command (which in this case isanalyze).The role running access-undenied-aws should be granted the appropriate permissions, to do so:Attach theSecurityAuditmanaged policy.If you would like to scan cross-account assets and analyze service control policies, attach the following inline policy. 
This policy allows AccessUndenied to assume roles in your other accounts:{"Version":"2012-10-17","Statement":[{"Sid":"AccessUndeniedAssumeRole","Effect":"Allow","Action":"sts:AssumeRole","Resource":["arn:aws:iam::<management_account_id>:role/AccessUndeniedRole","arn:aws:iam::<account_1_id>:role/AccessUndeniedRole","arn:aws:iam::<account_2_id>:role/AccessUndeniedRole","..."]}]}If you do not wish to attachSecurityAudit, you may instead attach the updatingleast-privilege AccessUndenied policy.Same account assets only, no SCPsWhen both the resource and the principal are in the same account as the credentials used to run AccessUndenied and Service Control Policies (SCPs) do not need to be considered, it is sufficient to just run AccessUndenied with default credentials or a profile, and you do not need to set up any additional profiles.Cross-account assets and SCPsTo consider assets in multiple accounts and/or SCPs in the management account, we need to set up AWS cross-account roles with thesame policyand the same name as each other (the default isAccessUndeniedRole)when setting up these roles, remember to set up the appropriate trust policy (trusting the credentials in the source account, the one you're running AccessUndenied in):{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"AWS":"arn:aws:iam::<source_account>:role/AccessUndeniedRole"},"Action":"sts:AssumeRole","Condition":{}}]}AttachSecurityAuditmanaged policy to the identity , or the updatingleast-privilege AccessUndenied policyCLI CommandsSimplest commandaccess-undenied-aws analyze --events-file cloudtrail_events.jsonAll options:Options: -v, --verbosity LVL Either CRITICAL, ERROR, WARNING, INFO or DEBUG --profile TEXT the AWS profile to use (default is default profile) --help Show this message and exit. Commands: analyze Analyzes AWS CloudTrail events and explains the reasons for... get-scps Writes the organization's SCPs and organizational tree to a fileAnalyzeThis command is used to analyze AccessDenied events. It can be used either with themanagement-account-role-arnparameter to retrieve SCPs, or with thescp-fileparameter to use a policy data file created by theget_scpscommand.Options: --events-file FILENAME input file of CloudTrail events [required] --scp-file TEXT Service control policy data file generated by the get_scps command. --management-account-role-arn TEXT a cross-account role in the management account of the organization, which must be assumable by your credentials. --cross-account-role-name TEXT The name of the cross-account role for AccessUndenied to assume. default: AccessUndeniedRole --output-file TEXT output file for results (default: no output to file) --suppress-output / --no-suppress-output should output to stdout be suppressed (default: not suppressed) --help Show this message and exit.Example:access-undenied-aws analyze --events-file events_file.jsonGet SCPsThis command is used to writes the organization's SCPs and organizational tree to an organizational policy data file. 
This command should be run from the management account.Options: --output-file TEXT output file for scp data (default: scp_data.json) --help Show this message and exit.Example:access-undenied-aws get-scpsThen when running analyzing (from the same account or a different account)access-undenied-aws analyze --events-file events_file.json --scp-file scp_data.jsonOutput Format{"EventId":"55555555-12ad-4f70-9140-d44428038119","AssessmentResult":"Missing allow in an identity-based policy","ResultDetails":{"PoliciesToAdd":[{"AttachmentTargetArn":"arn:aws:iam::123456789012:role/MyRole","Policy":{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":"rds:DescribeDBInstances","Resource":"arn:aws:rds:ap-northeast-3:123456789012:db:*"}]}}]}}This output for example, tells us that access was denied because of there is noAllowstatement in an identity-based policy. To remediate, we should attach to the IAM rolearn:aws:iam::123456789012:role/MyRolethe policy:{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":"rds:DescribeDBInstances","Resource":"arn:aws:rds:ap-northeast-3:123456789012:db:*"}]}Output FieldsAccessDeniedReason:The reason why access was denied. Possible ValuesMissing allow in:Identity policyResource policy (in cross-account access)Both (in cases of cross-account access)Permissions boundaryService control policy (with allow-list SCP strategy)Explicit deny from:Identity policyResource policyPermissions boundaryService control policyInvalid action:a principal or action that cannot be simulated by access undenied."Allowed" An"Allowed"result means that access undenied couldn't find the reason for AccessDenied, this could be for a variety of reasons:Policies, resources and/or identities have changed since the CloudTrail event and access now actually allowedUnsupported resource policy typeUnsupported policy type (VPC endpoint policy, session policy, etc.)Unsupported condition keyResultDetailsThese are the details of the result, explaining the remediation steps, this section may contain eitherPoliciesToAddorExplicitDenyPolicies.PoliciesToAddThese are the policies which need to be added to enable least-privilege access. Each policy contains:AttachmentTargetArn: the entity to which the new policy should be attachedPolicy: The content of the policy to be addedExplicitDenyPoliciesThese are the policies cause explicit deny, which need to be removed or modified to facilitate access. AccessUndenied also gives the specific statement causing theDenyoutcome.AttachmentTargetArn: the entity to which the policy causing explicit deny is currently attachedPolicyArn: The arn (if applicable) of the policy causing explicit deny. For the sake of convenience, resource policies are represented by generic placeholder arns such as:arn:aws:s3:::my-bucket/S3BucketPolicyPolicyName: The policy name, if applicable. 
Resource policies are represented by generic placeholder names such asS3BucketPolicyPolicyStatement: The specific statement in the aforementioned policy causing explicit denyAcknowledgementsThis project makes use of Ian Mckay'siam-datasetBen Kehoe'saws-error-utils.AppendicesSetting up a venvpython -m venv .venvPlatformShellCommand to activate virtual environmentPOSIXbash/zsh$ source .venv/bin/activatefish$ source .venv/bin/activate.fishcsh/tcsh$ source .venv/bin/activate.cshPowerShell Core$ .venv/bin/Activate.ps1Windowscmd.exeC:> .venv\Scripts\activate.batPowerShellPS C:> .venv\Scripts\Activate.ps1Getting Cloudtrail events from the AWS Console's event historyOpen the AWS consoleGo to "CloudTrail"In the sidebar on the left, click Event HistoryFind the event you're interested in checking. Unfortunately, the console doesn't let you filter by ErrorCode, so you'll have to filter some other way, e.g. by username or event name.Download the event:By clicking the event, copying the event record, and pasting it to a json file locally. or,By clicking download events -> download as JSON in the top-right corner. (Access Undenied will handle all events where the ErrorCode is AccessDenied or Client.UnauthorizedOperation)With the event saved locally, you may use thecli commandExample Cloudtrail eventOne event in file:{"awsRegion":"us-east-2","eventID":"5ac7912b-fd5d-436a-b60c-8a4ec1f61cdc","eventName":"ListFunctions20150331","eventSource":"lambda.amazonaws.com","eventTime":"2021-09-09T14:01:22Z","eventType":"AwsApiCall","userIdentity":{"accessKeyId":"ASIARXXXXXXXXXXXXXXXX","accountId":"123456789012","arn":"arn:aws:sts::123456789012:assumed-role/RscScpDisallow/1631196079303620000","principalId":"AROARXXXXXXXXXXXXXXXX:1631196079303620000","sessionContext":{"attributes":{"creationDate":"2021-09-09T14:01:20Z","mfaAuthenticated":"false"},"sessionIssuer":{"accountId":"123456789012","arn":"arn:aws:iam::123456789012:role/RscScpDisallow","principalId":"AROARXXXXXXXXXXXXXXXX","type":"Role","userName":"RscScpDisallow"},"webIdFederationData":{}},"type":"AssumedRole"},"errorCode":"AccessDenied","errorMessage":"User: arn:aws:sts::123456789012:assumed-role/RscScpDisallow/1631196079303620000 is not authorized to perform: lambda:ListFunctions on resource: * with an explicit deny","sourceIPAddress":"xxx.xxx.xxx.xxx","readOnly":true,"eventVersion":"1.08","userAgent":"aws-cli/2.2.16 Python/3.8.8 Linux/4.19.128-microsoft-standard exe/x86_64.ubuntu.20 prompt/off command/lambda.list-functions","requestID":"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxx","managementEvent":true,"recipientAccountId":"123456789012","eventCategory":"Management"}Multiple events in file:{"Records":[{"awsRegion":"us-east-1","eventID":"xxxxxxxx-xxxx-xxxx-xxxx-8234c1555c12"//... rest of cloudtrail_event ...},{//... another cloudtrail_event ...}// more events...]}Least privilege AccessUndenied policy{"Version":"2012-10-17","Statement":[{"Sid":"AccessUndeniedLeastPrivilegePolicy","Effect":"Allow","Action":["ecr:GetRepositoryPolicy","iam:Get*","iam:List*","iam:SimulateCustomPolicy","kms:GetKeyPolicy","lambda:GetPolicy","organizations:List*","organizations:Describe*","s3:GetBucketPolicy","secretsmanager:GetResourcePolicy","sts:DecodeAuthorizationMessage"],"Resource":"*"}]}
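As a convenience, the JSON written via --output-file can be post-processed with a few lines of standard-library Python; this sketch assumes a single result shaped like the Output Format example above, and "results.json" is a hypothetical --output-file value:

import json

with open("results.json") as f:  # hypothetical --output-file value
    result = json.load(f)

print(result["EventId"], "-", result["AssessmentResult"])
for policy in result.get("ResultDetails", {}).get("PoliciesToAdd", []):
    print("Attach to:", policy["AttachmentTargetArn"])
    print(json.dumps(policy["Policy"], indent=2))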
accfifo
TODO: Provide a complete README fileUsageCheck tests to see examples.DevelopmentNix shell is provided:nix-shellYou can choose between different Python versions:nix-shell --arg python "\"python39\"" nix-shell --arg python "\"python310\""Once you are inside the Nix shell, you can then run Visual Studio Code:code .TestingIf you are inside the Nix shell:python test_accfifo.pyWithout entering the Nix shell, for Python 3.9:nix-shell --arg python "\"python39\"" --command "python test_accfifo.py"... and for Python 3.10:nix-shell --arg python "\"python310\"" --command "python test_accfifo.py"LicenseThis library is licensed underBSD 2-Clause.
accialcli
No description available on PyPI.
accim
ACCIM stands for Adaptive Comfort Control Implemented Model.In research terms, this is a proposal for a paradigm shift, from using fixed PMV-based to adaptive setpoint temperatures, based on adaptive thermal comfort algorithms and it has been widely studied and published on scientific research journals (for more information, refer tohttps://orcid.org/0000-0002-3080-0821).In terms of code, this is a python package that transforms fixed setpoint temperature building energy models into adaptive setpoint temperature energy models by adding the Adaptive Comfort Control Implementation Script (ACCIS). This package has been developed to be used in EnergyPlus building energy performance simulations.The figure below clearly explains the aim of adaptive setpoint temperatures: introducing hourly temperature values into the adaptive comfort zone. On the left column, you can see the simulation results of a naturally ventilated building, while on the right column, the same building in mixed-mode operation with adaptive setpoint temperatures.1. CitationIf you use this package, please cite us:Sánchez-García, D., Bienvenido-Huertas, D., Rubio-Bellido, C., 2021.Computational approach to extend the air-conditioning usage to adaptive comfort: Adaptive-Comfort-Control-Implementation Script.Automation in Construction. 131, 103900.https://doi.org/10.1016/j.autcon.2021.103900Sánchez-García, D., Martínez-Crespo, J., Hernando, U.R.R., Alonso, C., 2023.A detailed view of the Adaptive-Comfort-Control-Implementation Script (ACCIS): The capabilities of the automation system for adaptive setpoint temperatures in building energy models.Energy and Buildings. 288.https://doi.org/10.1016/j.enbuild.2023.1130192. How to use2.0 RequirementsTo use accim, the following must be installed:Python 3.9EnergyPlus (any version between 9.1 and 23.1 those included)2.1 InstallationFirst of all, you need to install the package:pip install accim2.2 Usage2.2.1 Transforming PMV-based into adaptive setpoint temperaturesThis is a very brief explanation of the usage. Therefore, if you don't get the results you expected or get some error, I would recommend reading the 'Detailed use' section at the documentation in the linkhttps://accim.readthedocs.io/en/master/accim will take as input IDF files those located at the same path as the script. You only need to run the following code:2.2.1.1 Short versionfrom accim.sim import accis accis.addAccis()Once you run this code, you will be asked to enter some information at the terminal or python console to generate the output IDF files.2.2.1.2 Longer versionfrom accim.sim import accis accis.addAccis( ScriptType=str, # ScriptType: 'vrf_mm', 'vrf_ac', 'ex_mm', 'ex_ac'. For instance: ScriptType='vrf_ac', SupplyAirTempInputMethod=str, # SupplyAirTempInputMethod: 'supply air temperature', 'temperature difference'. For instance: SupplyAirTempInputMethod='supply air temperature', Output_keep_existing=bool, # Output_keep_existing: True or False. For instance: Output_keep_existing=False, Output_type=str, # Output_type: 'simplified', 'standard', 'detailed' or 'custom'. For instance: Output_type='standard', Output_freqs=list, # Output_freqs: ['timestep', 'hourly', 'daily', 'monthly', 'runperiod']. For instance: Output_freqs=['hourly', 'runperiod'], Output_gen_dataframe=bool, # Output_keep_existing: True or False. For instance: Output_keep_existing=False, Output_take_dataframe=pandas Dataframe, EnergyPlus_version=str, # EnergyPlus_version: '9.1', '9.2', '9.3', '9.4', '9.5', '9.6', '22.1', '22.2' or '23.1'. 
For instance: EnergyPlus_version='23.1', TempCtrl=str, # TempCtrl: 'temperature' or 'temp', or 'pmv'. For instance: TempCtrl='temp', ComfStand=list, # it is the Comfort Standard. Can be any integer from 0 to 21. For instance: ComfStand=[0, 1, 2, 3], CAT=list, # it is the Category. Can be 1, 2, 3, 80, 85 or 90. For instance: CAT=[3, 80], ComfMod=list, # it is Comfort Mode. Can be 0, 1, 2 or 3. For instance: ComfMod=[0, 3], SetpointAcc=float, # it is the accuracy of the setpoint temperatures CoolSeasonStart=dd/mm date in string format or integer to represent the day of the year, # it is the start date for the cooling season CoolSeasonEnd=dd/mm date in string format or integer to represent the day of the year, # it is the end date for the cooling season HVACmode=list, # it is the HVAC mode. 0 for Full AC, 1 for NV and 2 for MM. For instance: HVACmode=[0, 2], VentCtrl=list, # it is the Ventilation Control. Can be 0 or 1. For instance: VentCtrl=[0, 1], MaxTempDiffVOF=float, # When the difference of operative and outdoor temperature exceeds MaxTempDiffVOF, windows will be opened the fraction of MultiplierVOF. For instance: MaxTempDiffVOF=20, MinTempDiffVOF=float, # When the difference of operative and outdoor temperature is smaller than MinTempDiffVOF, windows will be fully opened. Between min and max, windows will be linearly opened. For instance: MinTempDiffVOF=1, MultiplierVOF=float, # Fraction of window to be opened when temperature difference exceeds MaxTempDiffVOF. For instance: Multiplier=0.2, VSToffset=list, # it is the offset for the ventilation setpoint. Can be any number, float or int. For instance: VSToffset=[-1.5, -1, 0, 1, 1.5], MinOToffset=list, # it is the offset for the minimum outdoor temperature to ventilate. Can be any positive number, float or int. For instance: MinOToffset=[0.5, 1, 2], MaxWindSpeed=list, # it is the maximum wind speed allowed for ventilation. Can be any positive number, float or int. For instance: MinOToffset=[2.5, 5, 10], ASTtol_start=float, # it is the start of the tolerance sequence. For instance: ASTtol_start=0, ASTtol_end_input=float, # it is the end of the tolerance sequence. For instance: ASTtol_start=2, ASTtol_steps=float, # these are the steps of the tolerance sequence. For instance: ASTtol_steps=0.25, NameSuffix=str # NameSuffix: some text you might want to add at the end of the output IDF file name. For instance: NameSuffix='whatever', verboseMode=bool # verboseMode: True to print all process in screen, False to not to print it. Default is True. For instance: verboseMode=True, confirmGen=bool # True to confirm automatically the generation of IDFs; if False, you'll be asked to confirm in command prompt. Default is False. For instance: confirmGen=False, )You can see a Jupyter Notebook in the link below:https://github.com/dsanchez-garcia/accim/blob/master/accim/sample_files/jupyter_notebooks/addAccis/using_addAccis.ipynbYou can also execute it at your computer. You just need to find the folder containing the .ipynb and all other files at the accim package folder within your site_packages path, inaccim/sample_files/jupyter_notebooks/addAccisThe path should be something like this, with your username instead of YOUR_USERNAME:C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python39\Lib\site-packages\accim\sample_files\jupyter_notebooks\addAccisThen, you just need to copy the folder to a different path (i.e. Desktop), open a cmd dialog pointing at it, and run "jupyter notebook". 
After that, an internet browser will pop up, and you will be able to open the .ipynb file.2.2.2 Renaming epw files for later data analysisYou can see a Jupyter Notebook in the link below:https://github.com/dsanchez-garcia/accim/blob/master/accim/sample_files/jupyter_notebooks/rename_epw_files/using_rename_epw_files.ipynbYou can also execute it at your computer. You just need to find the folder containing the .ipynb and all other files at the accim package folder within your site_packages path, inaccim/sample_files/jupyter_notebooks/rename_epw_filesThe path should be something like this, with your username instead of YOUR_USERNAME:C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python39\Lib\site-packages\accim\sample_files\jupyter_notebooks\rename_epw_filesThen, you just need to copy the folder to a different path (i.e. Desktop), open a cmd dialog pointing at it, and run "jupyter notebook". After that, an internet browser will pop up, and you will be able to open the .ipynb file.2.2.3 Running simulationsYou can see a Jupyter Notebook in the link below:https://github.com/dsanchez-garcia/accim/blob/master/accim/sample_files/jupyter_notebooks/runEp/using_runEp.ipynbYou can also execute it at your computer. You just need to find the folder containing the .ipynb and all other files at the accim package folder within your site_packages path, inaccim/sample_files/jupyter_notebooks/runEpThe path should be something like this, with your username instead of YOUR_USERNAME:C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python39\Lib\site-packages\accim\sample_files\jupyter_notebooks\runEpThen, you just need to copy the folder to a different path (i.e. Desktop), open a cmd dialog pointing at it, and run "jupyter notebook". After that, an internet browser will pop up, and you will be able to open the .ipynb file.2.2.4 Functions and methods for data analysis; making figures and tablesYou can see a Jupyter Notebook in the link below:https://github.com/dsanchez-garcia/accim/blob/master/accim/sample_files/jupyter_notebooks/Table/using_Table.ipynbYou can also execute it at your computer. You just need to find the folder containing the .ipynb and all other files at the accim package folder within your site_packages path, inaccim/sample_files/jupyter_notebooks/TableThe path should be something like this, with your username instead of YOUR_USERNAME:C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python39\Lib\site-packages\accim\sample_files\jupyter_notebooks\TableThen, you just need to copy the folder to a different path (i.e. Desktop), open a cmd dialog pointing at it, and run "jupyter notebook". After that, an internet browser will pop up, and you will be able to open the .ipynb file.2.2.5 Full example: from preparation of the input IDFs and EPWs, to simulation and data analysis and visualizationYou can see a Jupyter Notebook in the link below:https://github.com/dsanchez-garcia/accim/blob/master/accim/sample_files/jupyter_notebooks/full_example/full_example.ipynbYou can also execute it at your computer. You just need to find the folder containing the .ipynb and all other files at the accim package folder within your site_packages path, inaccim/sample_files/jupyter_notebooks/full_exampleThe path should be something like this, with your username instead of YOUR_USERNAME:C:\Users\YOUR_USERNAME\AppData\Local\Programs\Python\Python39\Lib\site-packages\accim\sample_files\jupyter_notebooks\full_exampleThen, you just need to copy the folder to a different path (i.e. 
Desktop), open a cmd dialog pointing at it, and run "jupyter notebook". After that, an internet browser will pop up, and you will be able to open the .ipynb file.3. DocumentationDetailed documentation, including the explanation of the different arguments, is at:https://accim.readthedocs.io/en/master/4. CreditsIt wouldn't have been possible to develop this python package without eppy, so thank you for such an awesome work.
accinv
(acc)-1accinv - Python project for inverse modeling of accelerator latticesNote:This project is in proof-of-concept stage and therefore lacks more advanced features of some of the implemented methods.In its current state, the package provides functionality for inverse modeling of linear optics via fitting of orbit response matrix (ORM) data, typically referred to as linear optics from closed orbits. It supportscpymadas a backend.The main class isaccinv.loco.Locowhich requires one of the models fromaccinv.model, as well as a method for computing Jacobians, as an argument. Two methods for Jacobian computation are available.AnalyticalJacobianMethod: This method uses an analytical formula to compute the Jacobian of the ORM with respect to quadrupole gradient errors and BPM and steerer gain errors. The data for the analytical formula is obtained from a single Twiss call for the current lattice configuration.NumericalMJacobianMethod: This method uses a finite difference approximation scheme to compute the Jacobian of the ORM with respect to the quadrupole gradient errors. Thus, the number of ORMs that will be computed is proportional to the number of quadrupoles.The inverse modeling process can be started by creating aLocoobject and calling itsrunmethod:fromaccinv.jacobianimportAnalyticalJacobianMethodfromaccinv.locoimportLoco,OrmMeasurementfromaccinv.modelimportMadxmodel=Madx(path='path/to/script.madx')loco=Loco(model_and_jacobian_method=(model,AnalyticalJacobianMethod),quadrupoles=[...],# names of quadrupoleshbpms=[...],# names of horizontal BPMshsteerers=[...],# names of horizontal steerersvbpms=[...],# names of vertical BPMsvsteerers=[...],# names of vertical steerersorm_measurement=OrmMeasurement(orm=np.load('path/to/measured_orm.npy'),uncertainty=np.load('path/to/orm_uncertainty.npy'),),)result=loco.run()Please consider the documentation of theLococlassfor more details.
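To try the finite-difference approach instead, the same constructor can be used with NumericalMJacobianMethod in place of AnalyticalJacobianMethod; this is a sketch under the assumption that the class is importable from accinv.jacobian like its analytical counterpart:

import numpy as np

from accinv.jacobian import NumericalMJacobianMethod  # assumed to live next to AnalyticalJacobianMethod
from accinv.loco import Loco, OrmMeasurement
from accinv.model import Madx

model = Madx(path='path/to/script.madx')
loco = Loco(
    model_and_jacobian_method=(model, NumericalMJacobianMethod),
    quadrupoles=[...],  # names of quadrupoles
    hbpms=[...],        # names of horizontal BPMs
    hsteerers=[...],    # names of horizontal steerers
    vbpms=[...],        # names of vertical BPMs
    vsteerers=[...],    # names of vertical steerers
    orm_measurement=OrmMeasurement(
        orm=np.load('path/to/measured_orm.npy'),
        uncertainty=np.load('path/to/orm_uncertainty.npy'),
    ),
)
result = loco.run()
# Note: this method computes one ORM per quadrupole, so it is slower than the analytical Jacobian.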
accio
UNKNOWN
accioapi
apitest framework
accio-api
Module class design: Request.py wraps the request methods and supports extension to multiple protocols (get/post/put). Config.py reads the configuration files, including the settings for different environments and the email-related settings. Log.py wraps the logging methods, split into debug, info, warning, error and critical. Email.py wraps smtplib to send the run results as an email notification. Assert.py wraps the assert methods. Hash.py wraps commonly used encryption methods. Session.py wraps the method for obtaining login cookies. run.py is the core code: it defines and runs the test case suites and generates the report.
accio-client
Accessible IO Client Tool
accipio
Package containing a wide range of functions to make TensorFlow easy to use with little to no previous knowledge of AI!
accipy
AcciPyThis repository contains the AcciPy Programmable Query Application for querying the National Transportation Safety Board (NTSB) investigations database. This was developed by Gage Broberg, Wyatt McGinnis, Srihari Menon and Prof Nancy Currie-Gregg from the Systems Analysis & Functional Evaluation Laboratory (SAFELab) at Texas A&M University. This tool is open-source and free-to-use. Please reference our publication when you do!How it worksAcciPy works the same way that querying from the Carol Query web interface does, but is much more robust! When you submit a request to AcciPy via the query() function, AcciPy does the following:Tries to adapt the request to the requirements of the NTSB serversBased on the query parameters you give it, AcciPy decides whether the query can be safely downloaded to your machine in one request, or if the query needs to be based to NTSB in smaller chunks.Passes the revised query to NTSB asynchronously through http requestsPlaces received data from NTSB in a folder called './output' in the current directoryData chunks from queries that could not be completed in a single request in folders named with query parameters that generated them, so that you can always tell what data you haveAfter completing all necessary queries, combines all data received into the file './output/aggregated_data.csv'ImplementationThe application is contained in a single python fileAcciPy.pyand can be added using the standard pythonimport. The module itself contains theCAROLQueryclass, which is used by the application to interact with the CAROL database, along with the standardqueryfunction, which takes in a set of rules to query CAROL with..Multiprocessing and performance limitationsQueries resulting in 3500 accidents or more can take over 60 seconds to return, causing the http request to time out. To avoid this, AcciPy breaks up your large queries into smaller ones that the NTSB servers can handle without error to ensure that you aren't left hanging with a 504 server timeout!The application leverages multiprocessing to take advantage of the user's hardware. Because of this, performance is dependent on the number of CPUs on the user's machine. However, it is important to note that the speed of the application is limited by the speed and concurrency level of the NTSB servers. For large queries (yielding over 150000 accidents), please allow up to 1 hour for all data to be transferred. Queries yielding under 3500 datapoints will be completed in under 60 seconds. In general, completion time for queries is proportional to the number of resulting accident datapoints.Quick StartBelow is a small script demonstrating different queries that can be processed using thequeryfunction. For a full list of available queries, take a look at thequery_optionsfile.import AcciPy # Different types of queries q1 = ("engine power", "Narrative", "Factual", "contains") q2 = ("Factual Narrative", "does not contain", "alcohol") q3 = "01/01/2000" q4 = ("fire",) # A query can take any number of rules, as long as it is input as a string or tuple. # These tuples can be in any order, the query function will sort and structure it. AcciPy.query(q1) AcciPy.query(q3, q4) AcciPy.query(q2, q1, q3)AcciPy LibraryThe AcciPy library contains theCAROLQueryclass, thequeryfunction, and other helper functions used to sort input parameters into their respective fields.query()Thequery()function is the main workhorse of the AcciPy library. 
It takes an arbritrary number of different arguments and converts them into query rules, which it then uses to create a CAROLQuery object to probe the CAROL database. A single argument is formatted as either a string or a tuple of strings, which are sorted into rules using helper functions. For query fields that are missing from an argument, the application uses the existing elements along with a dictionary of known values to fill in the missing values. If the program cannot decide which fields fit the existing arguments, it raises an exception and halts the program. These arguments can contain key words and dates. For a full list of available queries, take a look at thequery_optionsfile.Date queriesDate queries can be submitted with just the date. This will search the database for records with a dateon or afterthe entered date. Examples of some valid and invalid singular date queries are shown below:# Valid date query q = "September 27, 2023" q = "27 Sep 2023" q = "09/27/2023" (mm/dd/yyyy) q = "27/09/2023" (dd/mm/yyyy) q = "2023/09/27" (yyyy/mm/dd) q = "9/27/23" (m/d/yy) q = "27/9/23" (d/m/yy) q = "2023/9/27" (yyyy/m/d) q = "2023-09-27" q = "2023.09.27" # Invalid date query q = "today" q = datetime.today()1 element queriesAll 1 element queries will be considered valid. AcciPy will first attempt to parse the query string as a date (e.g. as shown above). If unable to parse as a date, AcciPy will search the Factual Narrative for the information. Examples of non-date 1 element queries are shown below:q = "engine power" q = ("fire",)2 element queries2 element queries will be considered invalid.#Invalid 2-rule query q1_2 = ("engine loss", "08/29/2005")3 element queriesEach element in 3 element queries will be checked to see if it matches one of the valid inputs, and AcciPy will attempt to sort it into a rule if possible. The following queries are examples of valid 3 element queries:q = ("Factual narrative", "does not contain", "fuel exhaustion") q = ("contains", "student", "analysis narrative")4 element queries (recommended)4 element queries provide the most robust querying in AcciPy and are the recommended way to query. Elements can be placed into the query object in any order, and AcciPy will sort the elements into the proper category for you. A list of all valid 4 element queries can be found in thequery_optionsfile.q = ("03/31/1990", "EventDate", "is after", "Event") q = ("false", "AmateurBuilt", "is", "Aircraft") q = ('Event', 'ID', 'is less than', '3334')Combining queriesEach distinct query is represented as a single string or a tuple of strings, separated by commas. Combining two queries is done by having separate tuples or strings, not combining the queries into one tuple. Valid and invalid multi-rule queries are shown below. You can combine any number of queries.#Valid 2-rule combined query q1 = "engine loss" q2 = ("12/10/2010", "EventDate", "is before", "Event") query(q1, q2) #Valid 5-rule combined query q1 = "alcohol" q2 = "08/29/2005" q3 = ("contains", "student", "factual narrative") q4 = ("12/10/2015", "EventDate", "is before", "Event") q5 = ("false", "AmateurBuilt", "is", "Aircraft") query(q1, q2, q3, q4, q5)Combining queries withAndandOrand therequire_allkey word argumentBy default in AcciPy, queries like the above examples will be combined withandlogic. This means that the queryq1 = "engine loss" q2 = ("12/10/2010", "EventDate", "is before", "Event") query(q1, q2)is equivalent to searching for "engine loss" in the factual narrativeandan event date that is before 12/10/2010. 
There is an optional key word argument for combined queries calledrequire_all. If you want to search for "engine loss" in the factual narrativeoran event date that is before 12/10/2010 you would do so as follows:q1 = "engine loss" q2 = ("12/10/2010", "EventDate", "is before", "Event") query(q1, q2, require_all=False)Downloading data and thedownloadkey word argumentBy default in AcciPy, queried data is not downloaded. However, you can choose to download data on a query-by-query basis by setting the download key word argument to True. This would look like the following:q1 = "engine loss" q2 = ("12/10/2010", "EventDate", "is before", "Event") query(q1, q2, download=True)Segments of the requested data will be downloaded to ./output/['{query info here}'] until AcciPy has finished downloading all data. Then, all data will be collected in the file ./output/aggregated_data.csv
acclaim-badges
Issue badges from Acclaim upon edx course completion.Acclaim Badges for EDXOverviewAdds a djangoapp to edx which provides a UI and API backend into Acclaim. Once installed, EDX adminstrators will be able to add Acclaim auth tokens and select badges to be issued upon course completion. This app then listens for course complete events, and issues badges if the student obtains a passing score.InstallInstall “acclaim_badges” using pip:pip install acclaim_badgesAdd “acclaim_badges” to your INSTALLED_APPS setting for EDX lms like this:INSTALLED_APPS = [ ... 'acclaim_badges', ]Note: this file is usually located at/edx-platform/lms/envs/common.pyInclude the acclaim_badges URLconf in your project urls.py like this:urlpatterns += ( url(r'^acclaim/', include('acclaim_badges.urls')), )The authorization token field is encypted. create a AES-256 keyset using keyzar:$ mkdir fieldkeys $ keyczart create --location=fieldkeys --purpose=crypt $ keyczart addkey --location=fieldkeys --status=primary --size=256Add keyset location to/edx-platform/lms/envs/common.py:ENCRYPTED_FIELDS_KEYDIR = '/path/to/fieldkeys'Run./manage.py lms syncdb--settingsawsto create the acclaim_badges lms app.UsageThe following useful URLs are made available after installation:/acclaim/tokens//acclaim/badge-courses/Add Acclaim organization and authorization token using/acclaim/tokens/Define a mapping between badge and course by accessing/acclaim/badge-courses/Note: when defining a mapping, the dropdown will populate with badge templates if the Acclaim API call is successful (valid token and orgainzation combination are used).DocumentationThe full documentation is athttps://acclaim-badges.readthedocs.org.LicenseThe code in this repository is licensed under the AGPL 3.0 unless otherwise noted.Please seeLICENSE.txtfor details.How To ContributeContributions are very welcome.Please readHow To Contributefor details.Even though they were written withedx-platformin mind, the guidelines should be followed for Open edX code in general.PR description template should be automatically applied if you are sending PR from github interface; otherwise you can find it it atPULL_REQUEST_TEMPLATE.mdIssue report template should be automatically applied if you are sending it from github UI as well; otherwise you can find it atISSUE_TEMPLATE.mdReporting Security IssuesPlease do not report security issues in public. Please [email protected] HelpHave a question about this repository, or about Open edX in general? Please refer to thislist of resourcesif you need any assistance.Change LogUnreleased[0.1.0] - 2017-05-10AddedFirst release on PyPI.
acclaimbadge-xblock
No description available on PyPI.
accli
Accelerator terminal CLI and python APIUser GuideRequirementsPython >=3.7.17Installationpip install accli --userUsage as modulepython -m accliUsage as executableYou might receive a warning similar to the following during installation:WARNING: The script accli.exe is installed in 'C:\Users\singhr\AppData\Roaming\Python\Python311\Scripts' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.You can also add the executable directory path to the PATH environment variable. Please follow the links below for instructions on adding the executable directory path to the PATH environment variable.Updating PATH on windowsUpdating PATH on linuxCommandaccli --helpOutputUsage: accli [OPTIONS] COMMAND [ARGS]...Note: You may need to prepend the command with either./(in linux) or.\(in windows).Developer GuideGeneral build and upload instructionsPlease followthis link.Release processCommit with the right version in accli/_version.pyRun 'python scripts/tag.py'python -m build
acc-lib
Acc LibAccelerator physics library containing plotting tools and other useful methods to handle data from X-suite tracking and other beam dynamics software, in constant evolution along with my PhD projects.Installation can be easily done via:pip install acc_libThe classplot_toolscontains several plotting methods to illustrate the turn-by-turn data generated by X-suite trackers, including centroid evolution, phase space distributions, Twiss parameters, simple FFT and tune footprint.The classresonance_lines, based upon the base class from Foteini Asvesta, contains methods to plot resonance lines of a given order in a given interval for tune plots. In particular, the methodplot_resonance_and_tune_footprintallows to print the tune footprint from X-suite tracking together with the resonance lines of a given order.The classparticlesallows to display particle properties.The classmadx_toolscontains methods to print errors from a crashed MAD-X process, print the elements neatly, plot the beam envelope, get the realistic apertures for both starting and end points of each element, and to plot the aperture together with the beam envelope.The classfootprint, inspired bythis X-suite example, allows for convenient plotting of footprints of coordinates in Cartesian/polar, but also of the tunes.
acclimatise
For the full documentation, refer to theGithub Pages Website.Acclimatise is a Python library and command-line utility for parsing the help output of a command-line tool and then outputting a description of the tool in a more structured format, for example aCommon Workflow Language tool definition.Currently Acclimatise supports bothCWLandWDLoutputs, but other formats will be considered in the future, especially pull requests to support them.ExampleLets say you want to create a CWL workflow containing the common Unixwc(word count) utility. Runningwc--helpreturns:Usage: wc [OPTION]... [FILE]... or: wc [OPTION]... --files0-from=F Print newline, word, and byte counts for each FILE, and a total line if more than one FILE is specified. A word is a non-zero-length sequence of characters delimited by white space. With no FILE, or when FILE is -, read standard input. The options below may be used to select which counts are printed, always in the following order: newline, word, character, byte, maximum line length. -c, --bytes print the byte counts -m, --chars print the character counts -l, --lines print the newline counts --files0-from=F read input from the files specified by NUL-terminated names in file F; If F is - then read names from standard input -L, --max-line-length print the maximum display width -w, --words print the word counts --help display this help and exit --version output version information and exit GNU coreutils online help: <http://www.gnu.org/software/coreutils/> Full documentation at: <http://www.gnu.org/software/coreutils/wc> or available locally via: info '(coreutils) wc invocation'If you runacclimatise explore wc, which means “parse the wc command and all subcommands”, you’ll end up with the following files in your current directory:wc.cwlwc.wdlwc.ymlThese are representations of the commandwcin 3 different formats. If you look atwc.wdl, you’ll see that it contains a WDL-compatible tool definition forwc:version 1.0 task Wc { input { Boolean bytes Boolean chars Boolean lines String files__from Boolean max_line_length Boolean words } command <<< wc \ ~{true="--bytes" false="" bytes} \ ~{true="--chars" false="" chars} \ ~{true="--lines" false="" lines} \ ~{if defined(files__from) then ("--files0-from " + '"' + files__from + '"') else ""} \ ~{true="--max-line-length" false="" max_line_length} \ ~{true="--words" false="" words} >>> }
accloudtant
UNKNOWN
accmon
AccMon is a monitoring middleware for Django, using FodtlMon, a monitor based on distributed first-order linear temporal logic. It allows monitoring of formulas defined over HTTP traffic, view processing and external tools via plugins. Note that this framework is a research prototype and should not be used in production!
accolade
No description available on PyPI.
accoladecli
No description available on PyPI.
accoladepraccli
No description available on PyPI.
accolades
Example PackageThis is a simple example package. You can useGithub-flavored Markdownto write your content.
accomate
No description available on PyPI.
accompanist
🎻 IntroductionWhat is Accompanist?Accompanist is an AWS WAF log analysis tool and an "accompanist" of WAF operation.It enables you to analyze AWS WAF logs and automatically generate instant analysis reports in just three simple steps.If you're operating AWS WAF, this tool can be helpful for instant log analysis and help you understand the trends of blocked or counted accesses.Reporting ItemsHistogram of requestsTotal numberNumber by each WAF rule providerTop 5 for the following categories:Rule groupRulePaths nameUser agentIP addressCountry codesHit count on specific pathsCommentAn optional field where you can write additional notesPrerequisiteRequirementWAF: AWS WAFv2 and/or Third Party WAF (using WebACL)Logging Destination: CloudWatch Logs or S3Support Action: BLOCK, COUNT or EXCLUDED_AS_COUNTIAM Role/User policyExample policies are noted in Appendix, the last section of this page.🚀 Get StartedInstall$pipinstallaccompanistUsageFirst, configure the settings using the following command; this will generate aconfig.jsonfile.$accompanistinitYou must set the log group name using this command, or you can choose to edit theconfig.jsonfile directly.Second, retrieve the query results with the following command.For example, to retrieve theBLOCKlog for the past three days$accompanistlisten-aBLOCK-d3If the log destination is S3, you can use "hear" instead of "listen".Finally, generate a report in PDF format.$accompanistplayUninstall$pipuninstallaccompanist🌱 How to useConvenient usageThe "listen" or "hear" with the--json_logor-joption can help to investigate the details of WAF logs.[Important] Please handle it carefully as it may include sensitive or confidential information.If you want to confirm a specific access in the log, you should re-run "play" with the option and open the json file generated in the current directory.The report color is customizable when you run "play" with the--coloror-doption.PrecautionThis tool works with AWS WAFv2 and does not support WAFv1.For AWS Managed Rules, you need to append "AWS" to the prefix, or the histogram will not work correctly.In COUNT action, as it is a well-known general mode of AWS WAF, COUNT or EXCLUDED_AS_COUNT is used.Using the "COUNT" action, the results are double counted by each rule.The logs may not be correctly calculated if you use the "Label" features in WAF.If the log destination is S3, it will take some time to acquire the log (several minutes)The analysis target period (days) can be set up to 40 days at most.🔨 SubcommandsinitConfigure a CWL log group setting and more.Usage:accompanistinit# interactive inputting processOptions:-l,--log-groupTEXTSetaCloudWatchLogsLoggroupname.[required]-p,--pathTEXTSetaURIpathforcountsthatisblocked/counted.[required]-c,--commentTEXTSetacommentforreport.[required]-h,--helpShowthismessageandexit.listenGet a WAF log file in CSV format.Usage:accompanistlisten[OPTIONS]Options:-a,--action[BLOCK|COUNT]ChoseanactiontypeofAWSWAF.Thedefaultis"BLOCK".[required]-d,--daysINTEGERSetanumberofthepastdaysuntiltodayforanalysistargetperiod.-s,--start_timeINTEGERSetaUNIXtimeoftheoldesttimeforanalysistargetperiod(insteadof"--days").-e,--end_timeINTEGERSetaUNIXtimeofthelatesttimeforanalysistargetperiod(insteadof"--days").-j,--json-logOutputaJSONlogfile.Pleasehandleitwithcare.fordebugging.-h,--helpShowthismessageandexit.hearNote: This is an experimental feature.Get a WAF log file in CSV format from
S3.Usage:accompanisthear[OPTIONS]Options:-a,--action[BLOCK|COUNT]ChoseanactiontypeofAWSWAF.Thedefaultis"BLOCK".[required]-d,--daysINTEGERSetanumberofthepastdaysuntiltodayforanalysistargetperiod.-s,--start_timeINTEGERSetaUNIXtimeoftheoldesttimeforanalysistargetperiod(insteadof"--days").-e,--end_timeINTEGERSetaUNIXtimeofthelatesttimeforanalysistargetperiod(insteadof"--days").-h,--helpShowthismessageandexit.playAnalysis WAF logs and generate a report.Usage:accompanistplay[OPTIONS]Options:-c,--colorfulSetarandomcolorofreporttheme(insteadofcolor).-d,--colorTEXTCustomizeacolorofreportthemewithcolorcode,(e.g.)#cccccc.-m,--mask-ipMaskIPaddressesonpiechart.-u,--utc-offsetINTEGERSetanumberofUTCoffest.ThedefautoffsetisUTC+9(Asia/Tokyo).-y,--y-limit[50|100|500|1000]AdjustaY-axismaxlimitationforhistogramsduetomanyrequests.-h,--helpShowthismessageandexit.📚 AppendixIAM Policy for CWLHere is an example of IAM policy with restricted permissions.{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":"logs:StartQuery","Resource":["arn:aws:logs:<region>:<aws-acount>:log-group:<log-group-name>:*"]},{"Effect":"Allow","Action":"logs:GetQueryResults","Resource":"*"}]}Note: The<region>,<aws-acount>and<log-group-name>in this sample policy should be replaced.IAM Policy for S3Here is an example of IAM policy with restricted permissions.{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":"s3:ListBucket","Resource":"arn:aws:s3:::<waf-log-bucket>"},{"Effect":"Allow","Action":"s3:GetObject","Resource":"arn:aws:s3:::<waf-log-bucket>/*"}]}Note: The<waf-log-bucket>in this sample policy should be replaced.LicenseThe MIT LicenseCopyright 2023 Itsuki YutakaPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
acconeer-exptool
Acconeer Exploration ToolExplore the Next Sensewith Acconeer Exploration Tool! Use one of ourevaluation kitstogether with our Python examples and start exploring the world of Acconeer's radar sensor technology. The Python scripts and the Application in this repository will help you to easily stream the radar sensor's data to your local machine to start radar sensor evaluation and/or algorithm development for your application.To run the Python exploration scripts, you will need anevaluation kitrunning the included Exploration or Module server, both of which are supplied with theAcconeer SDK and Module SWimage.This release is developed forAcconeer SDK and Module SWA111-v2.15.2andA121-v1.5.0. Running this version is strongly recommended, as we continuously fix bugs and add features.Quickstart for WindowsThere is a portable version of the Acconeer Exploration Tool for Windows:Downloadthe zip file and extractDouble click theupdate.batfile and wait for the installation to finish, which might take a couple of minutesDouble click therun_app.batFor an in-depth evaluation, we recommend a full installation as described below.DocumentationDocumentation is available atdocs.acconeer.com, where you can also find theFAQ.Newsv7.4.0 released. See theChangelog.Setting up your evaluation kitXM126XM125XC120 + XE121 (A121)Raspberry Pi (A121 on XE121)Raspberry Pi (A111 on XC111+XR111 or XC112+XR112)XM112XM122XM132For additional resources, head over to theAcconeer developer page. There you will find both a getting started guide and a video showing you how to set up your evaluation kit. There you will also find the SDK download.Setting up your local machineRequirementsPython 3.8 or newer. Older versions have limited or no support.Tested on:Python 3 (developed and tested on 3.8, 3.9, 3.10, 3.11 and 3.12)Windows 10Ubuntu 20.04SetupInstalling theacconeer-exptoolpackageInstall from PyPI:python -m pip install --upgrade acconeer-exptool[app]Depending on your environment, you might have to replacepythonwithpython3orpy.For other options, have a look atdocs.acconeer.com.Windows COM port driversIf no COM port is recognized when plugging in a module, you might need to install a driver. See information about your specific module atdocs.acconeer.comLinux setupAfter installing theacconeer-exptoolpackage, you can runpython -m acconeer.exptool.setupDepending on your environment, you might have to replacepythonwithpython3orpy.which interactively configures your machine and downloads needed dependencies. This is done in order for your machine to work at its best with Exploration tool.acconeer.exptool.setupperforms the steps described in theLinux setup section on docs.acconeer.com.ApplicationUsing the application is the easiest way to start exploring Acconeer's radar sensor and our application examples:python -m acconeer.exptool.appDepending on your environment, you might have to replacepythonwithpython3orpy.In the top right box of the application, namedConnection, select the interface you wish to useSPI: auto-detects an XM112 connected to USB2 (USB1 is also needed for power)Socket: specify the IP address of your Raspberry Pi running the streaming serverSerial: specify the serial port that is assigned to the sensorConnections viaSerialhave the option of choosing aProtocol. The choices areModuleandExploration, where the protocol should match the server installed on the module (Module serverorExploration server, respectively). Choosing the wrong protocol will show an error.After pressingConnect, a connection should be established. 
In the box below labelledScan controls, select the service or processing example you want to test. Now you may tune the sensor and processing settings to your specific setup. Once you pressStart measurement, the application will start fetching data from the sensor and plotting the results. After pressingStop, you can save (and later load data) or just replay the data stored in the buffer.The ML interface(no longer supported)Support for the Machine Learning interface in Exploration Tool has been dropped.If you still need to use it, it is possible to use an old version of Exploration Tool.From theacconeer-python-explorationdirectory:git checkout v3And follow the instructions in an old version of this document (README.md).Note that this version of Exploration Tool will not be actively supported. Compatibility with new RSS versionsis not guaranteed.Running an example script on your local machineIf you prefer using the command line for testing and evaluation of our examples you can use the following instructions.XC111+XR111 or XC112+XR112 (mounted on a Raspberry Pi):python examples/a111/basic.py -s <your Raspberry Pi IP address>XM112+XB112 via SPI over USB:python examples/a111/basic.py -spiAny module via UART over USB, attempting to auto-detect the serial port:python examples/a111/basic.py -uAny module via UART over USB, using a specific serial port:python examples/a111/basic.py -u <the serial port, for example COM3>Depending on your environment, you might have to replacepythonwithpython3orpy.Choosing which sensor(s) to be used can be done by adding the argument--sensor <id 1> [id 2] .... The default is the sensor on port 1. This is not applicable for the modules.Scripts can be terminated by pressing Ctrl-C in the terminal.DisclaimerHere you find thedisclaimer.FAQ and common issuesSee theFAQon the Acconeer documentation pages.
accorder
Accorder takes care of the various tasks that Memory of the World amateur librarians do in order to maintain their shared catalogs online. It builds a searchable, standalone, portable webapp which one can then simply copy to a USB disk and open BROWSE_LIBRARY.html in a web browser. It uploads all of the books and metadata from the local Calibre library (together with the portable webapp) to the server. It helps a librarian maintain and share her catalog at https://library.memoryoftheworld.org together with other amateur librarians. It does all of the above in one go by typing: accorder release PROFILE. The configuration file keeps information about one or more PROFILEs. Under every PROFILE's configuration section there is information about the directory path of the local Calibre library, the librarian's name, the credentials needed to upload the files to the destination server, etc.
accordian
Event dispatch in Python 3.8 using asyncioInstallationpip install accordianGetting Startedimport asyncio from accordian import signal my_event = signal("my_event") @my_event.connect async def pow(x, y): await asyncio.sleep(0.1) print(f"pow({x}, {y})") return x ** y @my_event.connect async def diff(x, y): await asyncio.sleep(0.2) print(f"diff({x}, {y})") return max(x, y) - min(x, y) # create events without blocking async def create(): my_event.send(4, 6) my_event.send(0, 1) asyncio.run(create()) # or block and collect results from all receivers async def collect(): results = await my_event.join(10, 3) assert set(results) == {1000, 7} asyncio.run(collect()) # signals without receivers return no results async def empty(): sig = signal("unknown") results = await sig.join(1, "foo", None) assert results == [] asyncio.run(empty())NamespacesBy default,accordian.signalcreates signals in a global namespace. You can create your own namespaces to group signals together. Here, a processor is passed the region and stage to create deployment tasks:from accordian import Namespace regions = {"east": Namespace(), "west": Namespace()} @regions["east"].signal("dev").connect async def deploy_east_dev(s3_url, creds): ... @regions["east"].signal("prod").connect async def deploy_east_prod(s3_url, creds): # remove pre-prod feature flags await sanitize_prod(s3_url, "east") ... @regions["west"].signal("prod").connect async def deploy_west_prod(s3_url, creds): # legacy region shims await patch_west_bundle(s3_url) await sanitize_prod(s3_url, "west") ... async def deploy(region, stage): s3_url = await bundle_for_region(region, stage) creds = await creds_for_region(region, stage) signal = regions[region].signal(stage) # create the deployment task without waiting signal.send(s3_url, creds) # create deployment tasks asyncio.run(deploy("east", "dev")) asyncio.run(deploy("west", "prod")) # wait for deployments to complete async def wait_for_tasks(): running = asyncio.all_tasks() await asyncio.wait(running) asyncio.run(wait_for_tasks())ContributingContributions welcome! Please make suretoxpasses before submitting a PR.DevelopmentTo set up a virtualenv and run the test suite:git clone https://github.com/numberoverzero/accordian.git make venv make
accordion
AccordionAboutInstallationExampleRequirementsContribution how-toAboutMake flat dict and back fromdictInstallationWith pip:pipinstallaccordionExamplefromaccordionimportcompress,expanddata={"a":[1,2,3],"b":{"c":"d"}}expected={"a/0":1,"a/1":2,"a/2":3,"b.c":"d"}assertcompress(data)==expectedassertexpand(compress(data))==dataRequirementsTested withpython3.6Contribution how-toRun tests:clone repo:git clone <your-fork>create and activate your virtualenvpip install -r requirements.txt && pip install -r dev-requirements./run_tests.sh
accordion_presentation
It is a simple horizontal (cycle2) accordion for django cms 3.0.6 and django 1.7.InstallationInstall from pypi$pipinstallaccordion_presentationor clone from with git$gitclonehttps://github.com/luisza/accordion_presentation.git$cdaccordion_presentation$pythonsetup.pyinstallSetupAnd put in your appsINSTALLED_APPS=(...'accordion_presentation','paintstore')Run migrate$pythonmanager.pymigrate
accordion-project
No description available on PyPI.
accord-nlp
NLP Framework
accordo
Deep Learning experiment orchestration tool
accord.py
Accord.pyTesting library for the discord.py libraryWork in progress, better documentation coming
account
No description available on PyPI.
accountability-model
No description available on PyPI.
accountable
Never leave the command line to update a ticket again.QuickstartInstallation:pip install accountableOnce installed, configure your account:accountable configureList all projects:accountable projectsList all issue types:accountable issuetypesoraccountable issuetypes DEVList metadata for an individual issue:accountable issueDEV-101Add a comment to an issue:accountable issueDEV-102addcomment"[~tpm]I'm BLOCKED"List available transitions for an issue:accountable issueDEV-103transitionsDo a transition for an issue:accountable issueDEV-104dotransition 1ConfiguringCurrently, only Basic Auth is supported. Runningaccountable configurewill prompt you to enter your username, password, and the your Jira domain.Since every account can be setup differently you might want to view custom fields for an issue. By default, the following fields are displayed when examining an issue:Reporter - Display nameAssignee - Display nameIssue type - nameStatus - Status category - nameSummaryDescriptionThese defaults can be changed by editing your~/.accountable/config.yaml. Nesting fields is supported. Check out the Jira documentationherefor information on fields in the payload.Using with GithooksTODOUsing with popodoroOAuthWhy?Jira already supports robust triggers, like changing a ticket’s status based on a pull request, or a branch being created. You can also transition tickets with commit messages.However, there are times when these automated triggers aren’t enough.Often, you’ll start work locally, and forget to put the ticket in progress. Or you’ll forget to add a transition to a commit message. Multiple actions listed in your commit message also aren’t relevant to the project’s history.
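As a rough illustration of scripting the workflow above, the sketch below drives the documented CLI commands from Python via subprocess; the issue key and transition id are placeholders taken from the examples, and it assumes `accountable configure` has already been run.

import subprocess

issue = "DEV-103"  # placeholder issue key from the examples above

# List the available transitions for the issue (documented command)
subprocess.run(["accountable", "issue", issue, "transitions"], check=True)

# Apply transition 1 to the issue (documented command)
subprocess.run(["accountable", "issue", issue, "dotransition", "1"], check=True)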
accountant
No description available on PyPI.
account-eth
eth-accountSign Ethereum transactions and messages with local private keysRead more in thedocumentation on ReadTheDocs.View the change log.Quickstartpipinstallaccount-ethDeveloper SetupIf you would like to hack on eth-account, please check out theSnake Charmers Tactical Manualfor information on how we do:TestingPull RequestsCode StyleDocumentationDevelopment Environment SetupYou can set up your dev environment with:[email protected]:ethereum/account-eth.gitcdaccount-eth virtualenv-ppython3venv .venv/bin/activate pipinstall-e".[dev]"To run the integration test cases, you need to install node and the custom cli tool as follows:apt-getinstall-ynodejs# As sudo./tests/integration/ethers-cli/setup_node_v12.sh# As sudocdtests/integration/ethers-cli npminstall-g.# As sudoTesting SetupDuring development, you might like to have tests run on every file save.Show flake8 errors on file change:# Test flake8when-changed-v-s-r-1account_eth/tests/-c"clear; flake8 account_eth tests && echo 'flake8 success' || echo 'error'"Run multi-process tests in one command, but without color:# in the project root:pytest--numprocesses=4--looponfail--maxfail=1# the same thing, succinctly:pytest-n4-f--maxfail=1Run in one thread, with color and desktop notifications:cdvenv ptw--onfail"notify-send -t 5000 'Test failure ⚠⚠⚠⚠⚠' 'python 3 test on eth-account failed'"../tests../account_ethRelease setupFor Debian-like systems:apt install pandocTo release a new version:makereleasebump=$$VERSION_PART_TO_BUMP$$How to bumpversionThe version format for this repo is{major}.{minor}.{patch}for stable, and{major}.{minor}.{patch}-{stage}.{devnum}for unstable (stagecan be alpha or beta).To issue the next version in line, specify which part to bump, likemake release bump=minorormake release bump=devnum. This is typically done from the master branch, except when releasing a beta (in which case the beta is released from master, and the previous stable branch is released from said branch).If you are in a beta version,make release bump=stagewill switch to a stable.To issue an unstable version when the current version is stable, specify the new version explicitly, likemake release bump="--new-version 4.0.0-alpha.1 devnum"
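A minimal local-signing sketch follows, assuming this package exposes the standard eth-account API (Account, encode_defunct) under the upstream eth_account import path; the import name is an assumption, since the listing above only covers the development setup.

from eth_account import Account          # import path assumed from upstream eth-account
from eth_account.messages import encode_defunct

acct = Account.create()                  # generate a new local private key
message = encode_defunct(text="hello")   # EIP-191 message encoding

signed = Account.sign_message(message, private_key=acct.key)
recovered = Account.recover_message(message, signature=signed.signature)

assert recovered == acct.address         # the signature verifies against the key's address
print(acct.address, signed.signature.hex())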
account-generator-helper
AccountGeneratorHelperLibrary to facilitate accounts generation.Unofficial API for temp email services.Receive SMS from free services.Parsing and testing proxies.Free solving regular text captcha.Generate fake person.Generate passwords and etc.ContentsSupported servicesGetting startedUsageTemp email servicesReceive SMSGenerate dataProxy parserCaptcha solvingComing soonSay thank you meSupported servicesServices for temporary mail✅Inbox Kitten✅TempMail +✅TempMail.lol✅TempMail✅GmailNator(Temp gmail email)Services for receiving SMS✅Receive Sms FreeServices for fake data✅RandomuserServices for proxy list✅Proxy List✅SSL Proxies✅Socks Proxy✅Free Proxy List✅ProxyScrape✅HideMy.name✅Advanced✅OpenProxy✅GeoNode✅OpenProxyServices for solving captcha✅OptiicGetting startedThis library tested with Python 3.6+ and Pypy 3. There are two ways to install the library:Installation using pip (a Python package manager):$ pip install account-generator-helperInstallation from source (requires git):$ git clone https://github.com/Dionis1902/AccountGeneratorHelper.git $ cd AccountGeneratorHelper $ python setup.py installor:$ pip install git+https://github.com/Dionis1902/AccountGeneratorHelper.gitIt is generally recommended using the first option.While the library is production-ready, it is still under development, and it has regular updates, do not forget to update it regularly by calling$ pip install account-generator-helper --upgradeUsageTemp email services# Inbox Kittenfromaccount_generator_helperimportInboxKittenmail=InboxKitten()print('Mail :',mail.set_email('test-mail'))# Mail : [email protected]_letterinmail.get_inbox():print('Letter :',_letter)# Letter : (Letter ..)print('Letter content :',_letter.letter)# Letter content : [email protected]_handler()defnew_mail(letter):print('New mail :',letter)@mail.letter_handler(from_email='[email protected]')deftest_from(letter):print('Test from :',letter)@mail.letter_handler(re_subject='.* test .*')deftest_re_subject(letter):print('Test re subject :',letter)@mail.letter_handler(from_email='[email protected]',subject='Test letter')deftest_handler(letter):print('Test handler :',letter)mail.poling()# TempMail +fromaccount_generator_helperimportTempMailPlus,TempMailPlusDomainsmail=TempMailPlus()print('Mail :',mail.set_email('test-mail',TempMailPlusDomains.MAILTO_PLUS))# Mail : [email protected]_letterinmail.get_inbox():print('Letter :',_letter)# Letter : (Letter ...)print('Letter content :',_letter.letter)# Letter content : [email protected]_handler()defnew_mail(letter):print('New mail :',letter)@mail.letter_handler(from_email='[email protected]')deftest_from(letter):print('Test from :',letter)@mail.letter_handler(re_subject='.* test .*')deftest_re_subject(letter):print('Test re subject :',letter)@mail.letter_handler(from_email='[email protected]',subject='Test letter')deftest_handler(letter):print('Test handler :',letter)mail.poling()# GmailNatorfromaccount_generator_helperimportGmailNatormail=GmailNator()print('Mail :',mail.set_email('[email protected]'))# Mail : [email protected]_letterinmail.get_inbox():print('Letter :',_letter)# Letter : (Letter ..)print('Letter content :',_letter.letter)# Letter content : [email protected]_handler()defnew_mail(letter):print('New mail :',letter)@mail.letter_handler(from_email='[email protected]')deftest_from(letter):print('Test from :',letter)@mail.letter_handler(re_subject='.* test .*')deftest_re_subject(letter):print('Test re subject :',letter)@mail.letter_handler(from_email='[email protected]',subject='Test 
letter')deftest_handler(letter):print('Test handler :',letter)mail.poling()# TempMailLolfromaccount_generator_helperimportTempMailLolmail=TempMailLol()print('Mail :',mail.get_email())# Mail : [email protected]_letterinmail.get_inbox():print('Letter :',_letter)# Letter : (Letter ..)print('Letter content :',_letter.letter)# Letter content : [email protected]_handler()defnew_mail(letter):print('New mail :',letter)@mail.letter_handler(from_email='[email protected]')deftest_from(letter):print('Test from :',letter)@mail.letter_handler(re_subject='.* test .*')deftest_re_subject(letter):print('Test re subject :',letter)@mail.letter_handler(from_email='[email protected]',subject='Test letter')deftest_handler(letter):print('Test handler :',letter)mail.poling()# Temp Mailfromaccount_generator_helperimportTempMailmail=TempMail()print('Mail :',mail.get_email())# Mail : [email protected]_letterinmail.get_inbox():print('Letter :',_letter)# Letter : (Letter ..)print('Letter content :',_letter.letter)# Letter content : [email protected]_handler()defnew_mail(letter):print('New mail :',letter)@mail.letter_handler(from_email='[email protected]')deftest_from(letter):print('Test from :',letter)@mail.letter_handler(re_subject='.* test .*')deftest_re_subject(letter):print('Test re subject :',letter)@mail.letter_handler(from_email='[email protected]',subject='Test letter')deftest_handler(letter):print('Test handler :',letter)mail.poling()Receive SMS# Receive Sms Freefromaccount_generator_helperimportReceiveSms,Countiesphone=ReceiveSms()country=phone.get_country(Counties.POLAND)phone=country.get_number()print('Phone number :',phone.number)# Phone number : 380665327743formessageinphone.get_last_messages():print(message)# (Message ...)Generate data# Generate fake personfromaccount_generator_helperimportgenerate_person,generate_personsprint(generate_person())# Person(gender='female', nam...)print(generate_persons(10))# [Person(gender='female', nam...), Person(gender='female', nam...), ...]# Utilitiesfromaccount_generator_helperimportget_password# Generate passwordprint(get_password())# i)7\\yc4EsvTQJG'print(get_password(numbers=False))# a<}>?;xZr!Ne{^^Hprint(get_password(special_symbols=False))# vX12FgcJ7PYwA3tnprint(get_password(upper_case=False))# ](}kh()|9~t(":4$print(get_password(upper_case=False,numbers=False,special_symbols=False))# mppimpgxchlznwmmProxy parser# Proxy parsingfromaccount_generator_helperimportProxiesproxies=Proxies()proxies.parse_proxies()print(proxies)# (Proxies proxies_count=11572)print(proxies.pop())# (Proxy proxy_type=HTTP address=203.23.106.209 port=80 country=Counties.CYPRUS)print(proxies.pop().strfproxy())# http://203.32.121.187:80Captcha solving# Solving regular text captchafromaccount_generator_helperimportCaptchaSolver# Get api key from https://optiic.dev/captcha_solver=CaptchaSolver('11r6wjas2zTHLTgdWvEjaap1xq7m7111ufUNFas1fwCS')print('Captcha 1 result :',captcha_solver.solve(open('images/captcha_1.png','rb')))# 97823Cprint('Captcha 2 result :',captcha_solver.solve(open('images/captcha_2.png','rb')))# 8CCPXPprint('Captcha 3 result :',captcha_solver.solve(open('images/captcha_3.png','rb')))# NRGFHGComing soonreCAPTCHA solverhCaptcha solverFunCaptcha solverBetter text captcha solverAdd more emails and receiving SMS servicesBetter fake person generator, with more data (credit card, bio, photo etc)Simple account generator (Steam, Outlook etc)Say thank you meUSDT (ERC20) : 0xB8314551f0633aee73f93Ff4389629B367e59189USDT (TRC20) : TYJmX4R22NmSMBu7HWbwuwRr7TW9jN5az9BTC : 
bc1q3jgp25rc8qtzx0fwd9ltpy45yv05hphu7pvwlaETH : 0xB8314551f0633aee73f93Ff4389629B367e59189BNB (Smart Chain) : 0xB8314551f0633aee73f93Ff4389629B367e59189
accountifie
No description available on PyPI.
accounting
version number: 0.0.1 author: Bernard OjengwaOverviewaccounting is a Python package providing simple and advanced number, money and currency formatting.Installation / UsageTo install use pip:$ pip install accountingOr clone the repo:$ git clonehttps://github.com/ojengwa/accounting.git$ python setup.py installContributingTBDExampleTBD
accountingkits
Accounting Kits. CONDA is recommended for managing the dependencies. This is a self-made package whose goal is to help deal with different problems in accounting research. WARNING: This version is still a PREVIEW and UNSTABLE! ANY functions and classes COULD BE CHANGED (NAMES OR OTHERS) IN THE FUTURE! 1. Setting up the package: I recommend installing the dependencies through conda-forge; otherwise errors may occur. In this rolling version (the package itself will, however, stop declaring dependencies), I use conda to manage my environment, but pip always tries to corrupt it. This means I could not put the following dependencies in setup.cfg; if you know a method that would let me manage the environment while still handling the dependencies for you, feel free to tell me — it is the first time I have written a Python package, thank you. Here are the conda install packages: certifi numpy pandas pathos requests python-Levenshtein thefuzz rapidfuzz sas7bdat nltk beautifulsoup4 fake-useragent Cython wordninja. You may install these special dependencies from pip only: ratelimit backoff. Note that some other packages are recommended but not required: selenium (pip). Then clone -> install: git clone https://github.com/qihangZH/accountingkits.git; cd accountingkits; pip install . If developing: #IF DEVELOPING python setup.py develop 2. What if I need to use a single module but find that it uses other modules? Good question. If so, you may have to replace the code, since a single module sometimes only uses a few _BasicFuncs functions. For example, in FuzzyMatchT.py: from .. import _BasicTools Searching in FuzzyMatch.py, you can find that the "_BasicFunc" results contain: with pathos.multiprocessing.Pool( # for safer exception in multiprocess initializer=_BasicFunc.MultiprocessF.processes_interrupt_initiator) as pool: ... And the only function that can be found is: def threads_interrupt_initiator(): """Each pool process will execute this as part of its initialization. Use this to keep multiprocessing safe... and gracefully interrupt by keyboard""" signal.signal(signal.SIGINT, signal.SIG_IGN) To replace it, you can put it directly in the module you need and add your own code; however, I cannot promise that _BasicFunc will not be refactored in a future version. # copy here def threads_interrupt_initiator(): """Each pool process will execute this as part of its initialization. Use this to keep multiprocessing safe... and gracefully interrupt by keyboard""" signal.signal(signal.SIGINT, signal.SIG_IGN) with pathos.multiprocessing.Pool( # for safer exception in multiprocess initializer=threads_interrupt_initiator) as pool: ... 3. Deprecation and Future Warnings. Any changes that cause Deprecation or Future Warnings will be listed here, since they mostly cause version errors. If a change is missing, kindly send me an email and I will show it in the README. However, Deprecation and Future warnings are unavailable for Preview/alpha/beta versions. 4. References. This project includes code from https://github.com/r-boulland/Corporate-Website-Disclosure, which is licensed under the MIT license — precisely, accountingkits/CrawlerApi/WayBackT.py. The full text of the MIT license can be found in the CrawlerT/LICENSE file.
accountingModules
UNKNOWN
accounting-vasco
No description available on PyPI.
accounting-vod
No description available on PyPI.
account-management-sdk
Manage user subscriptions and clusters
accounts
No description available on PyPI.
accounts-shared
No description available on PyPI.
accountsSSO
shiboken 1.0.4, pyside qt4.7+ 1.0.4, accounts.qt, signon-qt, AccountSetup
account-utils
No description available on PyPI.
accoutning-api-py
No description available on PyPI.
accp
Audio, Caption Crawler and ProcessorDownloads and processes the audios and captions(subtitles) from Youtube videos for Speech AIRequirementsCurrently requires python >= 3.6FFmpegTo Usefrom accp import ACCP playlist_name="" playlist_url = "" accp = ACCP(playlist_name, playlist_url) accp.download_audio() #download audio from youtube accp.download_caption() #download captions from youtube accp.audio_split() #splitResultsdatasets |- playlist name |- metadata.csv |- alignment.json |- wavs ├── 1.wav ├── 2.wav ├── 3.wav └── ...andmetadata.csvshould look like:{ 0001.wav|그래서 사람들도 날 핍이라고 불렀다., 0002.wav|크리스마스 덕분에 부엌에 먹을게 가득했다., 0003.wav|조가 자신이 그 사람이라고 나섰다., ... }andalignment.jsonshould look like:{ "./datasets/playlist name/wavs/0001.wav": "그래서 사람들도 날 핍이라고 불렀다.", "./datasets/playlist name/wavs/0002.wav": "크리스마스 덕분에 부엌에 먹을게 가득했다.", "./datasets/playlist name/wavs/0003.wav": "조가 자신이 그 사람이라고 나섰다.", }
accphys
No description available on PyPI.
acc-provision
No description available on PyPI.
accpy
## This program provides a variety of function-based accounting utilities, designed by Professor Chou Kuo-Hua (周國華) of National Pingtung University, Taiwan. ### BondEntry(a,b,c,d,e) The BondEntry function generates the amortization entries for a bond issue, its interest payments and principal repayment; parameter a is the bond's total face amount, parameter b is the bond's stated annual interest rate (e.g., for 3.5%, enter 3.5), parameter c is the bond's effective market interest rate (e.g., for 3.75%, enter 3.75), parameter d is the number of interest payments per year, and parameter e is the bond's term in years. ### BondPrice(a,b,c,d,e) The BondPrice function computes the bond's total issue price; parameter a is the bond's total face amount, parameter b is the stated annual interest rate (e.g., for 3.5%, enter 3.5), parameter c is the effective market interest rate (e.g., for 3.75%, enter 3.75), parameter d is the number of interest payments per year, and parameter e is the bond's term in years. ### BreakEven(a,b) The BreakEven function computes the break-even sales amount for a multi-product sales mix; parameter a is the company's total fixed costs, and parameter b is the number of products the company sells. ### Confucious() The Confucious function tells a short story about Confucius, the Great Sage, and accounting. ### DDB(a,b,c,m) The DDB function (DDB stands for double declining balance) generates the depreciation entries for each period under the double-declining-balance method; parameter a is the equipment's acquisition cost, parameter b is its estimated useful life, parameter c is its estimated salvage value, and parameter m is the multiple of the straight-line depreciation rate, usually 2 or 1.5. ### FPDB(a,b,c) The FPDB function (FPDB stands for fixed percentage of declining balance) generates the depreciation entries for each period under the fixed-percentage-of-declining-balance method; parameter a is the equipment's acquisition cost, parameter b is its estimated useful life, and parameter c is its estimated salvage value. ### isave(a,b,c) The isave function (isave is short for InstallmentSavings) computes the maturity value (principal plus interest) of an installment savings plan with monthly deposits; parameter a is the monthly deposit amount, parameter b is the number of deposits (e.g., 12 for a one-year plan), and parameter c is the annual interest rate (e.g., for 3.5%, enter 3.5). ### LoanPayment(a,b,c) The LoanPayment function computes the monthly principal-and-interest payment on a bank loan; parameter a is the total loan amount, parameter b is the loan term in years, and parameter c is the annual interest rate (e.g., for 3.5%, enter 3.5). ### SLN(a,b,c) The SLN function (SLN stands for straight line method) generates the depreciation entries for each period under the straight-line method; parameter a is the equipment's acquisition cost, parameter b is its estimated useful life, and parameter c is its estimated salvage value. ### SYD(a,b,c) The SYD function (SYD stands for sum of years' digits) generates the depreciation entries for each period under the sum-of-the-years'-digits method; parameter a is the equipment's acquisition cost, parameter b is its estimated useful life, and parameter c is its estimated salvage value. ### YuTheGreat() The YuTheGreat function tells a short story about Yu the Great, the legendary flood-tamer, and accounting.
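A short usage sketch based on the signatures described above; the flat `from accpy import ...` path and the exact return behaviour are assumptions, since the listing documents only the function names and parameters.

from accpy import BondPrice, BondEntry, SLN, LoanPayment  # import path is an assumption

# Issue price of a 1,000,000 bond: 3.5% coupon, 3.75% market rate,
# interest paid twice a year, 5-year term (rates entered as 3.5 / 3.75 per the docs)
price = BondPrice(1000000, 3.5, 3.75, 2, 5)

# Amortization entries for the same bond issue
BondEntry(1000000, 3.5, 3.75, 2, 5)

# Straight-line depreciation entries: cost 500,000, 5-year life, 50,000 salvage value
SLN(500000, 5, 50000)

# Monthly payment on a 2,000,000 loan over 20 years at 2.1% annual interest
LoanPayment(2000000, 20, 2.1)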
accredidact-downloader
Download AccreDidact in PDF formatA command line tool and library to download AccreDidact in PDF formatIntroductionAccreDidactis a Dutch magazine for continuing medical education for psychiatrists.This package provides a Python libary and command line tool for convenient downloading of the online contents.InstallationIf you have Python on your system you can do the usual:$ pip install accredidact-downloaderQuickstartCommand line usage examplesList all issues in the current year. For listing only, no authentication is required.$ accredidact-downloader -lList all issues in the years 2002, 2004, 2006, 2007, and 2008:$ accredidact-downloader -l -y 2002 2004 2006-2008Download the latest issue to the default download-directory. For downloading, a username and password is required for authentication.$ accredidact-downloader -u username -p password -dDownload the issues with the IDs 2022-4 and 2022-5 to the ~/Downloads download-directory. An ID is composed of the year and number of the issue.$ accredidact-downloader -u username -p password -d -i 2022-4 2022-5 -w ~/DownloadsEmail the latest issue:$ accredidact-downloader -e -u username -p password --smtp_host smtp.example.com --smtp_port 587 --smtp_username [email protected] --smtp_password <password> --sender [email protected] --recipient [email protected] command line help::$ accredidact-downloader -hConfiguration FileThe configuration fileconfig.inican change the default values for command line options. It should be written using a standard INI style. The keys should be grouped into sections. For now, the name of the sections are ignored. The section name appears on a line by itself, in square brackets ([ and ]). Configuration files may include comments, prefixed by # or ;. Comments may appear on their own on an otherwise empty line, possibly indented.LocationTheconfig.iniconfiguration file should be put in the default config directory. This location is different on different operating systems. A custom configuration file can be provided with the--configargument.Linux:$HOME/.config/accredidact-downloader/config.ini, which respects theXDG_CONFIG_HOMEenvironment variable.MacOS:$HOME/Library/Application Support/accredidact-downloader/config.iniWindows:%APPDATA%\accredidact-downloader\config.iniPrecedence / Override orderCommand line options override the values in a configuration file.Exampleconfig.ini; This is a comment # This is another comment [settings] ; a section marker is required in INI files verbose = True download_dir = /home/folkert/Downloads username = <username> ; username for authentication with https://www.accredidact.nl/ password = <password> ; password smtp_host = smtp.example.com ; host name or ip address of the SMTP server smtp_port = 587 ; port of the SMTP server smtp_username = [email protected] ; accountname, username, or email address of your email account for authentication smtp_password = <password> ; password of your email account for authentication. sender = [email protected] recipient = [email protected] an issue is downloaded, it is copied to the default download directory. This location is different on different operating systems. A custom download directory can be provided with the--download_dirargument.LocationmacOS:~/Library/Application Support/accredidact-downloaderWindows:%APPDATA%\accredidact-downloaderLinux (and other Unices):~/.local/share/accredidact-downloaderEmailYou can send an automated email message with the downloaded PDF as an attachment, for example tosend it to your Kindle. 
To use this feature, the--emailargument should be provided to the command line tool, along with the required arguments:--smtp_host,--smtp_port,--smtp_username,--smtp_password,--sender, and--recipient. This option presumes the--downloadargument.Please note that if you use 2-step-verification in a Gmail-account, you might need an App Password (seehttps://support.google.com/accounts/answer/185833)Changelog[0.1.1] - 2022-08-01Fix argumentsCorrect Windows default download directory in READMEFix handling of subject and body arguments[0.1.0] - 2022-03-01First release on PyPI.AuthorFolkert van der Beek -https://gitlab.com/fvdbeek
accrete
Multi Tenant App for DjangoShared approach multi tenancy system for Django Projects.InstallationAdd accrete to installed_apps and accrete.middleware.TenantMiddleware to middlewareTenantModelBase Model with a ForeignKey to tenant.Tenant and objects set to accrete.models.TenantManagerBasic Usagefromaccrete.modelsimportTenantModelclassMyModel(TenantModel):...Each user can be a member of multiple Tenants, just add them to tenant.members.This is necessary when using the Mixins and Middleware provided by this app.Middlewaretenant.middleware.TenantMiddlewareThis Middleware adds the Tenant(request.tenant) and Member(request.member) objects as attributes to the request object and sets a cookie with the tenant_id.If the user is a member of multiple tenants, the request is parsed for a tenant_id in this order.The "tenant" parameter in the POST dataThe Header X-TENANT-IDThe "tenant_id" URL Parameter in the GET dataThe "tenant_id" cookie previously set by the MiddlewareIf no tenant could be assigned the two attributes are set to None.Additionally, the user is checked for membership of the found tenant.Adds the tenant to the POST data.The Middleware must be added to the MIDDLEWARE setting after your authentication Middleware as it needs access to request.user.is_authenticated().ViewsMixinstenant.views.TenantRequiredMixinAdds tenant and optionally access right checks to the dispatch method.This Mixin is meant as a substitute to django.contrib.auth.mixins.LoginRequiredMixin as TenantRequiredMixin inherits LoginRequiredMixin.Adds the member_access_groups attribute to the view. This attribute can be a list of codes from the tenant.models.AccessGroup model. If present, the member must be part of one of the listed groups to access the view. Failing checks are handled in a similar way as in LoginRequiredMixin.Decoratorstenant.decorators.tenant_requiredSubstitute for django.contrib.auth.decorators.login_requiredChecks if a tenant is set on the request and redirects to the TENANT_MISSING_URL specified in the settings.The decorator itself is wrapped by login_required and can pass the arguments redirect_field_name and login_url to login_required.Formstenant.forms.FormForm class that adds the tenant as a field and filters the queryset of every field that has a queryset attribute.tenant.forms.ModelFormSame behaviour as tenant.forms.Form.SettingsCustom settingsTENANT_MISSING_URLRedirect to this URL when no tenant could be set for an authenticated user.TENANT_MEMBER_NOT_AUTHORIZED_URLRedirect to this URL when a member tries to access a URL without having the needed access rights.
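To tie the pieces above together, here is a minimal sketch of a tenant-scoped list view using the documented TenantRequiredMixin and member_access_groups attribute; the description mixes accrete.* and tenant.* module paths, so the import path and the "managers" access-group code are assumptions.

from django.views.generic import ListView
from tenant.views import TenantRequiredMixin  # module path as written in the description

from .models import MyModel  # a TenantModel subclass like the one defined above


class MyModelListView(TenantRequiredMixin, ListView):
    model = MyModel
    # The member must belong to one of these AccessGroup codes (documented attribute);
    # "managers" is a hypothetical code.
    member_access_groups = ["managers"]

    def get_queryset(self):
        # request.tenant is attached by the TenantMiddleware
        return MyModel.objects.filter(tenant=self.request.tenant)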
accretion
This package has been parked by Matt Bullock to protect you against packages adopting names that might be common mistakes when looking for ours. You probably wanted to install accretion_cli. For more information, seehttps://accretion.readthedocs.io/en/stable/.
accretion-cli
The Accretion CLI is the primary point for controlling Accretion resources.The Accretion CLI maintains configuration state in a “Deployment File”.WarningAccretion is under active development and is not yet stable. The below reflects the target interface for the Accretion CLI. Not all commands will work yet.UsageinitInitialize theDEPLOYMENT_FILEfor deployments toREGIONS.This does NOT deploy to those regions.Runaccretion updateto update and fill all regions in a deployment file.accretioninitDEPLOYMENT_FILEREGIONS...updateUpdate deployments in all regions described inDEPLOYMENT_FILE.This will also initialize any empty deployment regions and complete any partial deployments.accretionupdateallDEPLOYMENT_FILEadd regionsAdd moreREGIONSto an existing deployment description inDEPLOYMENT_FILE.This does NOT deploy to those regions.Runaccretion updateto update and fill all regions in a deployment file.accretionaddregionsDEPLOYMENT_FILEREGIONS...destroyDestroy all resources for an Accretion deployment described inDEPLOYMENT_FILE.WarningThis will destroy ALL resources in ALL regions. Be sure that is what you want to do before running this.accretiondestroyDEPLOYMENT_FILErequestRequest a new layer version build.ImportantThese operations are currently completely asynchronous with no way of tracking a layer build through the CLI. I plan to add tooling around this later, but the exact form it will take is still TBD.mattsb42/accretion#27rawRequest a new layer in every region inDEPLOYMENT_FILE. The Layer must be described in the Accretion format inREQUEST_FILE.{"Name":"layer name","Language":"Language to target","Requirements":{"Type":"accretion","Requirements":[{"Name":"Requirement Name","Details":"Requirement version or other identifying details"}]},"Requirements":{"Type":"requirements.txt","Requirements":"Raw contents of requirements.txt file format"}}NoteThe only supported language at this time ispython.accretionrequestrawDEPLOYMENT_FILEREQUEST_FILErequirementsRequest a new layer namedLAYER_NAMEin every region inDEPLOYMENT_FILE. The Layer requirements must be defined in the Python requirements.txt format inREQUIREMENTS_FILE.accretionrequestDEPLOYMENT_FILEREQUIREMENTS_FILElistlayersImportantThis command has not yet been implemented.List all Accretion-managed Lambda Layers and their versions in the specified region.accretionlistlayersDEPLOYMENT_FILEREGION_NAMEdescribelayer-versionImportantThis command has not yet been implemented.Describe a Layer version, listing the contents of that Layer.accretiondescribelayer-versionDEPLOYMENT_FILEREGION_NAMELAYER_NAMELAYER_VERSIONcheckImportantThis command has not yet been implemented.Check a “Request File” for correctness.accretioncheckREQUEST_FILEDeployment FileWarningDeployment files MUST NOT be modified by anything other than Accretion tooling.An Accretion deployment file describes the stacks associated with a single Accretion deployment.It is a JSON file with the following structure:{"Deployments":{"AWS_REGION":{"Core":"STACK_NAME","ArtifactBuilder":"STACK_NAME","LayerBuilder":"STACK_NAME"}}}Request FileAn Accretion require file describes the Layer that is being requested.It is a JSON file with the following structure:{"Name":"layer name","Language":"Language to target","Requirements":{"Type":"accretion","Requirements":[{"Name":"Requirement Name","Details":"Requirement version or identifying details"}]},"Requirements":{"Type":"requirements.txt","Requirements":"Raw contents of requirements.txt file format"}}
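As an illustration, the sketch below writes a Request File in the documented format and hands it to the documented `accretion request raw` command; the file names and the requirements content are placeholders.

import json
import subprocess

request = {
    "Name": "my-layer",
    "Language": "python",  # the only supported language per the docs
    "Requirements": {
        "Type": "requirements.txt",
        "Requirements": "requests\nattrs\n",  # raw requirements.txt contents (placeholder)
    },
}

with open("request.json", "w") as f:  # placeholder request-file name
    json.dump(request, f, indent=2)

# deployment.json is a placeholder Deployment File previously created with `accretion init`
subprocess.run(["accretion", "request", "raw", "deployment.json", "request.json"], check=True)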
accretion-common
WarningThis package is intended only for internal use within Accretion. This documentation is provided for information purposes only. No guarantee is provided on the modules and APIs described in here remaining consistent. Directly reference at your own risk.Common resources for Accretion components.
accretion-workers
WarningThis package is intended only for internal use within Accretion. This documentation is provided for information purposes only. No guarantee is provided on the modules and APIs described in here remaining consistent. Directly reference at your own risk.Accretion workers.
acCRISPR
acCRISPR: an activity-correction method for improving CRISPR screen accuracyFor details on usage and installation, visithttps://github.com/ianwheeldon/acCRISPR.
accrocchio
accrocchioAccrocchio is a library to mark and being notified of smelly code (a.k.a, “accrocchio”).Examplefromaccrocchio.badgeofshameimportaccrocchiofromaccrocchioimportobserversclassAClassThatSmells(metaclass=accrocchio):pass@accrocchiodefa_function_that_smells():passaccrocchio.how_many()# here we have 1, as you have declared a smelly classAClassThatSmells()accrocchio.how_many()# here we have 2, as you have created an instance of a smelly classa_function_that_smells()accrocchio.how_many()# here we have 3, as you have invoked a smelly functionaccrocchio.reset()accrocchio.how_many()# here we have 0# You can also be notified of smelly code execution, such as:classMyAccrocchioObserver(observers.AccrocchioObserver):defon_accrocchio(self):print('Another accrocchio!')defreset(self):print('Reset accrocchi')accrocchio.add_observer(MyAccrocchioObserver())a_function_that_smells()# prints 'Another accrocchio!'accrocchio.reset()# prints 'Reset accrocchi'It is also possible to decorate the entire class. Both the declaration and the instantiation of such classes increase the accrocchio counters.fromaccrocchio.badgeofshameimportaccrocchiofromaccrocchioimportobservers@accrocchioclassAClassThatSmells:passaccrocchio.how_many()# here we have 1, as you have declared a smelly classAClassThatSmells()accrocchio.how_many()# here we have 2, as you have created an instance of a smelly classYou might declare an accrocchio using type hinting, as follows:fromaccrocchio.badgeofshameimportaccrocchio,detonatordeff(a:detonator[int]):passaccrocchio.how_many()# here we have 1, as you have declared a smelly parameterdetonator.how_many()# here we have 1, as you have declared a smelly parameterf(1)accrocchio.how_many()# here we still have 1detonator.how_many()# here we still have 1The library also implementsMichael Duell’s resign patterns.fromaccrocchio.badgeofshameimportaccrocchio,detonator@accrocchiodefaccrocchio_fun():pass@detonatordefdetonator_fun():passaccrocchio_fun()accrocchio.how_many()# here we have 1, as you have invoked an accrocchio functiondetonator.how_many()# here we have 0, as you have never invoked a detonator functiondetonator_fun()detonator.how_many()# here we have 1, as you have invoked a detonator functionaccrocchio.how_many()# here we have 2, as you have invoked a detonator function, which is an accrocchioYou may mark arbitrary code as an accrocchio:fromaccrocchio.badgeofshameimportdetonator,epoxy,this_is_a,this_is_anthis_is_an(epoxy)this_is_a(detonator)detonator.how_many()# this will be 1epoxy.how_many()# this will be 1If you need to have to mark a specific portion of your code as an accrocchio, you can use it as a context manager, as follows:fromaccrocchio.badgeofshameimportdetonator,epoxywithdetonator:...withepoxy:...detonator.how_many()# this will be 1epoxy.how_many()# this will be 1For a full list of the implemented accrocchio resign patterns, please consultMichael Duell’s resign patterns.Some final notes:This library is useful only if a small part of the software is an accrocchioWe intentionally left out Python versions before 3.5, as we think they are a complete accrocchio.We intentionally did not pass the accrocchio to the ‘on_accrocchio’ observer function, as you should treat all the accrocchioes the same wayThe plural for accrocchio is accrocchioesIf you are using this library, you are deliberately brutalizing The Zen of Python; thus it has been replaced with a more appropriate one. Just doimport thisafterimport accrocchio.
accscout
Account ScoutSearch for certain user accounts on popular websites on the internetInstallationWith pip package managerpipinstallaccscoutFrom repositoryDownload repositorygitclonehttps://gitlab.com/richardnagy/security/accscoutcdaccscoutRun setup scriptMake sure you use the correct python command. On Linux systems it's python3 by default.pythonsetup.pyinstallUsageAfter installation, simply use the command with the username you're searching foraccscout[USERNAME]ExampleLicenseStandard MIT license (view)