A GUI application that selects a random Jackbox game from a curated list of favorite games based on the number of players.
It will omit some games if the number of players is odd or even.
Git clone the repository to use it locally via the terminal, provided the library requirements are met.
Add functionality for filtering drawing-specific games
Improve UI design
Settings and preferences
Add more games to the game list with future Jackbox releases (Jackbox Party Pack 10, Oct 2023)
Add functionality to automatically open game from app
Input the .exe/Steam path in the settings menu
Optionally scan drives for jackbox.exe
Future work
Improvements to UX: the game list is curated around my party's preferences, so I would like to add the ability to select your own game list.
The user would be presented with the entire game list and tick boxes to make their selection.
I would also like to add a settings button to adjust some preferences. For example, the app currently omits some games from being chosen based on whether the player count is odd or even (this is to ensure fair and even teams); the settings should give the user the ability to turn this feature off.
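As a sketch of the selection logic described above (the game names, supported player counts, and even-teams flags below are made-up placeholders, not the app's actual list):

```python
import random

# Hypothetical curated list: each game maps to the player counts it supports
# and whether it needs an even player count for fair teams. These entries are
# illustrative placeholders, not the app's real data.
GAMES = {
    "Quiplash 3": {"players": range(3, 9), "even_only": False},
    "Fibbage 4": {"players": range(2, 9), "even_only": False},
    "Tee K.O.": {"players": range(3, 9), "even_only": True},
}

def pick_game(player_count):
    """Return a random game that supports player_count, skipping
    even-teams games when the count is odd."""
    candidates = [
        name for name, rules in GAMES.items()
        if player_count in rules["players"]
        and not (rules["even_only"] and player_count % 2 == 1)
    ]
    return random.choice(candidates) if candidates else None
```

The settings toggle mentioned above would simply bypass the `even_only` check.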
We built this Comfy Twitch Chat Module live on Twitch for Coding Cafe!
Special Thanks: Comfy.JS is possible thanks to tmi.js, maintained by @AlcaDesign
Comfy.JS lets you integrate with Twitch chat for your Twitch channel SUPER EASILY in just a few lines of code. Here’s a quick 3-min video on how to use it:
Instafluff
Like these projects? The best way to support my open-source projects is by becoming a Comfy Sponsor on GitHub!
var ComfyJS = require("comfy.js");
ComfyJS.onCommand = (user, command, message, flags, extra) => {
  if (flags.broadcaster && command === "test") {
    console.log("!test was typed in chat");
  }
};
ComfyJS.Init("MyTwitchChannel");
Browser
Download and add comfy.js from the dist folder or include from the JSDelivr CDN:
<html>
  <head>
    <script src="https://cdn.jsdelivr.net/npm/comfy.js@latest/dist/comfy.min.js"></script>
  </head>
  <body>
    <script type="text/javascript">
      ComfyJS.onCommand = (user, command, message, flags, extra) => {
        if (flags.broadcaster && command === "test") {
          console.log("!test was typed in chat");
        }
      };
      ComfyJS.Init("MyTwitchChannel");
    </script>
  </body>
</html>
Flags
Currently, the flags possible in onCommand() and onChat() are:
broadcaster
mod
founder
subscriber
vip
highlighted
customReward
Extra Parameter
Currently, the extra parameter for the onCommand() contains the following fields:
id (the message id)
channel
roomId
messageType
messageEmotes
isEmoteOnly
userId
username
displayName
userColor
userBadges
flags
timestamp
customRewardId (only works with custom channel rewards with required-text)
If the message is a command, the extra parameter will contain an additional field:
sinceLastCommand
which contains the time, in milliseconds, since the last time any user (or the specific user) used the same command. This field is convenient for implementing global cooldowns or spam filters. See the example below:
ComfyJS.onCommand = (user, command, message, flags, extra) => {
  if (flags.broadcaster && command === "test") {
    if (extra.sinceLastCommand.any < 100) {
      console.log(`The last '!test' command by any user was sent less than 100 ms ago`);
    }
    if (extra.sinceLastCommand.user < 100) {
      console.log(`The last '!test' command by this specific user (as denoted by the 'user' parameter) was sent less than 100 ms ago`);
    }
  }
};
Reading Chat Messages
You can read chat messages by using the onChat() handler.
var ComfyJS = require("comfy.js");
ComfyJS.onCommand = (user, command, message, flags, extra) => {
  if (command === "test") {
    ComfyJS.Say("replying to !test");
  }
};
ComfyJS.Init(process.env.TWITCHUSER, process.env.OAUTH);
Joining a Different Channel
You can join a different channel or group of channels by specifying them in Init().
ComfyJS.onReward = (user, reward, cost, message, extra) => {
  console.log(user + " redeemed " + reward + " for " + cost);
};
Comfy.JS includes functions to manage Channel Point Rewards. These functions require the ClientID used in getting the Twitch OAuth password for the channel.
This repository contains the whole ETL pipeline for Voluum data: it makes the API requests, then transforms and enriches the data directly in the main files. Two Voluum API request schemas for Python are provided: Countries Report and Campaigns Report.
Chrome Developer Tools is used to write a proper API request for Voluum data.
(Pic was provided by the Voluum Support Team, many thanks to Katarzyna)
Load the report you need -> open the Network section in Chrome Developer Tools -> choose the entry named “report? blah-blah-blah” -> Copy -> Copy as cURL (bash). At this point you’ll have the exact request for the report you created on the dashboard. To convert it to a Python request, use Postman, this website, or any other suitable service.
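As an illustration, the converted request often ends up looking something like this in Python. The endpoint path, query parameters, and header name below are assumptions for demonstration; copy the real values from your own “Copy as cURL” output:

```python
import urllib.parse

# Illustrative sketch of a converted cURL request. The endpoint, query
# parameters, and auth header name are assumptions; take the real ones
# from the "Copy as cURL" output for your report.
def build_report_request(token, date_from, date_to, group_by="country"):
    """Assemble the URL and headers for a hypothetical Voluum report call."""
    params = {
        "from": date_from,
        "to": date_to,
        "groupBy": group_by,
    }
    url = "https://api.voluum.com/report?" + urllib.parse.urlencode(params)
    headers = {"cwauth-token": token}  # assumed session-token header
    return url, headers
```

Sending it is then a matter of `requests.get(url, headers=headers)`.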
The provided batch files (task_LTD.bat & …) are responsible for running the scripts and sending the data directly to GCS storage. Afterwards they run the BigQuery.py script, which transfers the data from GCS to BigQuery.
Please be aware that for BigQuery.py to work properly you need to set the GOOGLE_APPLICATION_CREDENTIALS environment variable so that your local PC can access the project. To create a JSON file with service-account key credentials, visit: https://cloud.google.com/docs/authentication/production.
From a command prompt, run the install command to set up your database.
bin/jtracker install
If you are making a change to the issue tracker’s web assets, you’ll also need to set up NPM. Please see the Asset Management documentation for more information.
Verify the installation is successful by doing the following:
View the site in your browser.
Run the get project command to pull issues, issue comments and other information related to the project from GitHub.
If you want the ‘Login with GitHub’ button to work properly you’ll need to register an app with GitHub. To do this, manage your account at github.com, go to the applications page, and create a new application.
You’ll be asked for the application URL and the callback URL. This can be your test server or your localhost environment, as long as you enter the URL your localhost app is running on. An example might be http://jissues.local.
Once you’ve registered the app at GitHub you’ll receive a Client ID and a Client Secret; enter these into your installation’s etc/config.json file, along with your GitHub login credentials. You should now be able to log in with GitHub successfully.
╔════════════════════════════════════════[ BRUTALES.XYZ ]════════════════════════════════════════╗
║ ██████╗ ██████╗ ██╗ ██╗████████╗ █████╗ ██╗ ███████╗███████╗ ██╗ ██╗██╗ ██╗███████╗║
║ ██╔══██╗██╔══██╗██║ ██║╚══██╔══╝██╔══██╗██║ ██╔════╝██╔════╝ ╚██╗██╔╝╚██╗ ██╔╝╚══███╔╝║
║ ██████╔╝██████╔╝██║ ██║ ██║ ███████║██║ █████╗ ███████╗ ╚███╔╝ ╚████╔╝ ███╔╝ ║
║ ██╔══██╗██╔══██╗██║ ██║ ██║ ██╔══██║██║ ██╔══╝ ╚════██║ ██╔██╗ ╚██╔╝ ███╔╝ ║
║ ██████╔╝██║ ██║╚██████╔╝ ██║ ██║ ██║███████╗███████╗███████║ ██╔╝ ██╗ ██║ ███████╗║
║ ╚═════╝ ╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚══════╝ ╚═╝ ╚═╝ ╚═╝ ╚══════╝║
╠═══════════════════════════════[ DIGITAL STUDIO | EST. 2022 ]═══════════════════════════════════╣
║ ║
║    >> "We are the conscious, accelerated blood that flows through the arteries of technology.  ║
║       We do not accept authority imposed from above; we draw our own map,                      ║
║       we navigate beyond the known territories. We are digital pirates,                        ║
║       artists of code, quantum dreamers. Cyberspace is our canvas."                            ║
║ ║
║ [█░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░█] ║
║ >>> ART + CODE + CRYPTO + WEB3 + LIBERTY <<< ║
║ ║
║ [█░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░█] ║
║ ║
║ <CODE> 01010100 01001000 01000101 00100000 01000110 01010101 01010100 01010101 01010010 </> ║
║ <CODE> 01000101 00100000 01001001 01010011 00100000 01001110 01001111 01010111 00100001 </> ║
║ ║
║ >>> FOUNDER: MARI LIN ★ <<< ║
║ ║
╚══════════════════════════[ BREAK THE SYSTEM | REWRITE THE FUTURE ]═════════════════════════════╝
Brutales XYZ – Digital Studio Portfolio Template
A minimalist, cyberpunk-inspired portfolio template for digital artists and creative studios. This prototype showcases a modern, non-fungible studio aesthetic while maintaining simplicity and effectiveness.
Update the following meta tags in index.html to match your studio’s information:
<meta name="title" content="Your Studio Name">
<meta name="description" content="Your Studio Description">
<meta property="og:url" content="your-domain.com">
<meta property="twitter:url" content="your-domain.com">
Images
Replace the following images with your own:
assets/img/favicon.ico – Website favicon
assets/img/social.png – Social media preview image
media/bxyz.svg – Studio logo
media/SIF.png and media/tob.png – Featured work images
🚀 Deployment
Clone this repository
Customize the content and images
Deploy to your preferred hosting service (GitHub Pages, Netlify, Vercel, etc.)
The template uses a custom CSS file (assets/css/css.css) for styling. Modify this file to match your brand’s aesthetic.
📝 License
This project is open source and available under the MIT License.
🤝 Contributing
Fork the repository
Create your feature branch (git checkout -b feature/AmazingFeature)
Commit your changes (git commit -m 'Add some AmazingFeature')
Push to the branch (git push origin feature/AmazingFeature)
Open a Pull Request
💫 About Brutales XYZ
Brutales XYZ is a digital art and advertising studio that combines creativity with technology. The studio specializes in various digital services including 3D modeling, web design, game development, and blockchain technology integration.
This repository contains solutions for the Capacitated Vehicle Routing Problem (CVRP), a classic optimization problem in logistics and transportation. The CVRP involves determining the optimal set of routes for a fleet of vehicles to deliver goods to a set of customers, starting and ending at a depot. Each vehicle has a maximum capacity, and the goal is to minimize the total distance traveled while ensuring that the demand of each customer is met without exceeding the vehicle capacities.
Implemented Algorithms
Random Search
Generates random solutions to the CVRP by randomly selecting customers for the routes, ensuring the capacity constraint is respected. It evaluates multiple random solutions and selects the best one.
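A minimal sketch of this random-search baseline, assuming demands are a dict of customer id to demand, every demand fits in one vehicle, and any evaluation function can be plugged in (illustrative only, not the repository’s actual code):

```python
import random

# Each trial shuffles the customers and packs them into routes greedily by
# capacity, so every candidate respects the capacity constraint. The best
# candidate under the supplied evaluate() function is kept.
def random_search(demands, capacity, evaluate, trials=100, rng=random):
    """Generate random capacity-feasible solutions and keep the best one
    according to evaluate(routes) -> cost (lower is better)."""
    best, best_cost = None, float("inf")
    for _ in range(trials):
        order = list(demands)
        rng.shuffle(order)
        routes, route, load = [], [], 0
        for cid in order:
            if load + demands[cid] > capacity:
                routes.append(route)
                route, load = [], 0
            route.append(cid)
            load += demands[cid]
        routes.append(route)
        cost = evaluate(routes)
        if cost < best_cost:
            best, best_cost = routes, cost
    return best, best_cost
```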
Greedy Search
Constructs a solution incrementally by always selecting the nearest customer that doesn’t violate the capacity constraint.
There are two implemented versions:
Standard: always selects the single nearest feasible customer.
Randomized Greedy Search: adds randomness by sometimes selecting one of the k-nearest customers instead of the nearest, to explore different potential solutions.
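A minimal sketch of the standard greedy construction, under assumed data shapes (customers as (x, y, demand) tuples, one shared vehicle capacity); the repository’s actual implementation may differ:

```python
import math

# Greedy nearest-neighbor construction: repeatedly visit the closest
# unserved customer that still fits in the current vehicle, opening a new
# vehicle when nothing fits. Data shapes are assumptions for illustration.
def greedy_cvrp(depot, customers, capacity):
    """customers: dict id -> (x, y, demand); returns a list of routes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    unserved = dict(customers)
    routes = []
    while unserved:
        route, load, pos = [], 0, depot
        while True:
            feasible = [(cid, c) for cid, c in unserved.items()
                        if load + c[2] <= capacity]
            if not feasible:
                break
            cid, c = min(feasible, key=lambda item: dist(pos, item[1]))
            route.append(cid)
            load += c[2]
            pos = c
            del unserved[cid]
        if not route:  # a single demand exceeds the vehicle capacity
            raise ValueError("customer demand exceeds vehicle capacity")
        routes.append(route)
    return routes
```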
Tabu Search
An iterative optimization method for the CVRP that explores neighboring solutions by making local changes, such as swapping customers between routes. It uses a tabu list to avoid revisiting recently explored solutions, enhancing the search for a global optimum. The best solution found during the process is selected.
Genetic Algorithm
An evolutionary algorithm inspired by natural selection.
Initial Population:
The initial population consists of solutions where 80% are generated using a greedy heuristic approach, and 20% are randomly generated solutions.
Selection (Tournament Selection):
Tournament selection is used to choose parent solutions based on their fitness scores. Each tournament selects a subset of solutions randomly and picks the best-performing solution among them.
Crossover (Ordered Crossover – OX):
Ordered crossover combines genetic material from two parent solutions to create new offspring.
Steps
Flattening Parents: Parents are flattened into linear lists of nodes.
Crossover Points: Two points are randomly chosen within these lists.
Creating Offspring: Offspring inherit a segment from one parent and fill the rest with nodes from the other parent, ensuring all nodes are included while respecting vehicle capacity constraints.
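The steps above can be sketched as follows (capacity-aware re-splitting of the flattened child is omitted for brevity; the details are assumptions, not the repository’s exact code):

```python
import random

# Ordered crossover (OX) on flattened parent tours: copy a random segment
# from parent_a, then fill the remaining positions with parent_b's nodes in
# their original order, so the child is a permutation of the same nodes.
def ordered_crossover(parent_a, parent_b, rng=random):
    n = len(parent_a)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = parent_a[i:j + 1]          # inherited segment
    fill = [node for node in parent_b if node not in child]
    k = 0
    for pos in range(n):
        if child[pos] is None:                   # fill the gaps in order
            child[pos] = fill[k]
            k += 1
    return child
```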
Mutation (Swap Mutation):
Swap mutation randomly exchanges positions of nodes within routes in offspring solutions.
Fitness Function:
fitness = α × total_distance + β × number_of_vehicles
where total_distance represents the total travel distance of all vehicles in the solution, and number_of_vehicles is the count of vehicles used. Alpha (α) and Beta (β) are coefficients that balance the importance of minimizing total distance against minimizing the number of vehicles. Lower fitness values indicate better solutions.
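The weighted fitness can be sketched directly (the coordinate layout and the alpha/beta values below are placeholder assumptions):

```python
import math

# Weighted fitness: total depot-to-depot travel distance plus a penalty per
# vehicle used. Coordinates and coefficient values are illustrative.
def fitness(routes, coords, depot, alpha=1.0, beta=100.0):
    """routes: list of lists of customer ids; coords: id -> (x, y)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    total = 0.0
    for route in routes:
        if not route:
            continue
        path = [depot] + [coords[c] for c in route] + [depot]
        total += sum(dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    vehicles = sum(1 for r in routes if r)
    return alpha * total + beta * vehicles
```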
Customizing Experiments
Running All Algorithms
To run all algorithms, use the run_all function. This function executes all algorithms on the specified problem instance and reports the best, worst, and average fitness across multiple executions. You have the flexibility to adjust all algorithm parameters dynamically by providing them as arguments when invoking the .run() method.
Running Experiments for the different parameters
You can customize the parameters (population_size, crossover_rate, mutation_rate) by modifying the respective arrays in the main script and using the run_experiment function. This function iterates over the specified parameter values, runs the GA with each value, and records the best, worst, and average fitness in separate CSV files for analysis.
This is a node module for parsing a TypeScript file (PolymerTS) and generating an empty documentation file (i.e. no code, only signatures) compatible with Polymer 1.0, which can be passed to iron-component-page for documentation. While we try to fill in any missing comments where we can, well-documented code is beneficial both to new developers coming into the project and to seasoned developers trying to remember why they did something a certain way when returning to a project.
Background Info
Since Red Pill Now changed our process to use TypeScript for components we lost the auto-generated documentation feature for our components. This node module attempts to restore that feature.
We understand that the format generated here is for Polymer 1.0. Once we move to Polymer 2.0 we will need to revisit all the models to change how they are rendered; this should not be a big undertaking.
Setup
First, make sure you have Node.js installed so we can use the Node Package Manager (NPM). Next, install the other key tools:
var polymerTsDoc = require('polymerts-doc-generator');

/**
 * @param {string} pathToTsFile - The path to the file we want to parse
 * @param {string} pathToDocFile - The directory where we want to put our documentation files
 */
var newDocFile = polymerTsDoc.start(pathToTsFile, pathToDocFile);
This will parse the pathToTsFile and generate the empty documentation file at the path of the pathToDocFile. The file name generated will be doc_original-file-name.html.
The generated file will be suitable for passing to an iron-component-page element.
Developing
Clone the project to your local environment and run:
npm install
This project is written in TypeScript. There is a compile script which simply runs tsc on the src directory. This will generate .js files in the respective src directories, which you can then test/debug.
Supported Patterns
In order for this tool to work, there are certain patterns you need to be aware of. Since this project uses the TypeScript compiler to determine all the information about a code block, it is fairly accurate and lenient in its pattern recognition: if it compiles, then theoretically this tool should be able to parse it. Also, any comments for methods, properties, etc. will be included in the generated documentation in the appropriate place.
Component
This is the uppermost class. All other parts are stored inside the component. Once all the different parts are collected they are rendered.
The above component definition will be converted to:
<!--
This is my cool component
@demo demo/index.html
@hero path/to/hero.png
-->
<dom-module id="my-component">
  <template>
    <style></style>
  </template>
  <script>
    (function() {
      Polymer({
        is: 'my-component',
        behaviors: [...],
        properties: {...},
        observers: [...],
        listeners: {...},
        ...
      });
    })();
  </script>
</dom-module>
HTML Comments
The presence of an @demo or @hero tag encountered in a comment block of the HTML File will determine which comment block will be deemed the comment block to use. All other comment blocks will be ignored.
<link rel="import" href="../polymer/polymer.html">
<link rel="import" href="../polymer-ts/polymer-ts.html">
<!-- This comment will be ignored because it is missing the appropriate tag(s) -->
<!--
This is my element's style and example usage documentation
@demo demo/index.html
-->
<dom-module id="my-component">
  <template>
    <style>
      ...
    </style>
    ...
  </template>
</dom-module>
Properties
If no comment is defined, one will be created and it will include the @type tag.
@component('my-component')
@behavior(Polymer['AppLocalizeBehavior'])
export class MyComponent extends polymer.Base { ... }

// OR

@component('my-component')
@behavior(Polymer.AppLocalizeBehavior)
export class MyComponent extends polymer.Base { ... }
The above would be transformed to an array and placed in the component structure:
is: 'my-component',
behaviors: [
  Polymer['AppLocalizeBehavior'],
  Polymer.AppLocalizeBehavior
],
...
Observers
If an observer has only one parameter defined, that property will need to be defined as a @property; we will add an observer definition to that @property definition.
@observe('propertyName')
_onPropertyName(propertyName) { ... }

// OR

@observe('propertyName,otherPropertyName')
_onPropertyName(propertyName, otherPropertyName) { ... }
The above will be transformed to:
propertyName: {
  type: Boolean,
  reflectToAttribute: true,
  observer: '_onPropertyName'
}

_onPropertyName(propertyName) { ... }

// OR

observers: [
  '_onPropertyName(propertyName,otherPropertyName)'
],

_onPropertyName(propertyName, otherPropertyName) { ... }
Computed Property
A new property will be created pointing to the propertyName method.
@computed()
propertyName(someOtherProp) { ... }

// OR

@computed({type: String})
propertyName(someOtherProp) { ... }
The above computed property will be transformed to the following. (NOTE: notice the type is Object in the first case; this is the default when type is not defined in the decorator.)
propertyName: {
  type: Object,
  computed: 'propertyName(someOtherProp)'
}

propertyName(someOtherProp) { ... }

// OR

propertyName: {
  type: String,
  computed: 'propertyName(someOtherProp)'
}

propertyName(someOtherProp) { ... }
Listener
If an @listener is defined and there is not a comment, a comment will be created with an @listens tag. If there is a comment with no @listens tag, we add an @listens tag.
@listener('someElementId.some-event')
_onSomeEvent(evt: CustomEvent) { ... }

// OR

@listener(SomeNameSpace.SOME_EVENT)
_onSomeEvent(evt: CustomEvent) { ... }
someFunction(arg1, arg2)

// OR

/**
 * Some function
 * @param {any} arg1
 * @param {any} arg2
 * @returns {string}
 */
someFunction(arg1, arg2) { ... }
Project Structure
polymerts-doc-generator
dist/
lib/
utils.js
models/
behavior.js
comment.js
component.js
computed.js
function.js
html-comment.js
listener.js
observer.js
program-part.js
property.js
index.js
src/
data/ component files for development/testing purposes
docs/ development/testing generated doc files
lib/
utils.ts
models/
behavior.ts
comment.ts
component.ts
computed.ts
function.ts
html-comment.ts
listener.ts
observer.ts
program-part.ts
property.ts
index.ts
.gitignore
gulpfile.js
package-lock.json
package.json
README.md
tsconfig.json
Reporting Bugs
Please use the Issues link in this project. Be descriptive, provide any errors that may have been produced, a snippet of the code that caused the error and if possible how to reproduce the issue.
Contributing
Pull Requests are welcome, encouraged and greatly appreciated. When contributing, please follow the same coding style present in the project. Also, if relevant, please provide comments to your changes. If your Pull Request is addressing a feature request or issue, please include the issue # in the commit. For example “Fixes issue: #123”.
Future Directions
After some conversations with the team, this may be a good starting point for converting a PolymerTS/Polymer 1.x component to a PolymerTS/Polymer 2.x component. Also, converting to Polymer 2.0 should not be that big of a leap: we will just need to change the toMarkup of the models, as long as the future PolymerTS 2.x maintains a similar structure.
This script simplifies the geometry of a GeoJSON file using the “shapely” library. The script takes 3 command line arguments: the input file, the output file, and a tolerance value. The tolerance value represents the maximum distance that the simplified geometry can deviate from the original geometry. The script uses the ‘simplify’ method from the shapely library, which implements the Douglas-Peucker algorithm, to simplify the geometry and reduce the size of the file.
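For intuition, the Douglas-Peucker idea can be sketched in pure Python (the script itself delegates to shapely’s simplify, so this is illustrative only):

```python
import math

# Pure-Python sketch of Douglas-Peucker: recursively drop points that
# deviate from the chord between the endpoints by less than the tolerance.
def douglas_peucker(points, tolerance):
    if len(points) < 3:
        return list(points)

    def perp_dist(p, a, b):
        # Perpendicular distance from p to the line through a and b.
        (ax, ay), (bx, by), (px, py) = a, b, p
        num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
        den = math.hypot(bx - ax, by - ay)
        return num / den if den else math.hypot(px - ax, py - ay)

    # Find the interior point farthest from the chord.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tolerance:
        return [points[0], points[-1]]   # everything in between is dropped
    left = douglas_peucker(points[:idx + 1], tolerance)
    right = douglas_peucker(points[idx:], tolerance)
    return left[:-1] + right
```

Larger tolerances drop more points, which is exactly the size/accuracy trade-off described below.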
Use Case
Simplifying large GeoJSON files that are too big to be easily handled and visualized.
Improving the performance of data visualization tools like Plotly, Leaflet, and D3, which often have limitations on the size of the data they can handle.
Reducing the size of large data sets for faster rendering and better performance.
Requirements
Python 3
shapely library
json library
How to Use
Install the required library by running “pip install shapely” (the json library is part of the Python standard library and does not need to be installed).
Run the script by providing 3 command line arguments: input file, output file and tolerance value.
The input file should be a valid GeoJSON file. The output file will be the simplified version of the input file.
The tolerance value should be a decimal value between 0 and 1, representing the maximum distance that the simplified geometry can deviate from the original geometry. The default value is 0.009.
The script will output the simplified GeoJSON file to the specified output file.
Note
The tolerance value is a critical factor in the simplification process, and the optimal value will depend on the specific data and the desired level of simplification. It’s important to experiment with different tolerance values to find the best balance between file size reduction and accuracy. Note also that the appropriate tolerance varies with the data, so you may want to adjust it accordingly.
This script currently supports only LineString, Point, Polygon, and MultiPolygon geometries; if you want to support other geometry types you can add them to the script.
The script previously included an additional function, ‘verifyArgv’, to check whether the input and output files are valid, but it was removed as it did not provide any useful information.
The script can be used as a command line tool or can be integrated with other scripts to simplify geojson files in bulk.
Overall, this script is a useful tool for simplifying large GeoJSON files and can help improve the performance of data visualization tools. It’s easy to use and can be integrated with other scripts to simplify large data sets in bulk.