Channel specialized in advanced topics: Artificial Intelligence, Machine Learning, Deep Learning, Computer Vision, Data Science, and Python. For ads: @otchebuch & @cobbl, https://telega.io/c/computer_science_and_programming
A library for building fast, reliable and evolvable network services
💻 https://github.com/cloudflare/pingora/tree/main
Challenging programming projects you should try:
🔗 https://jamesg.blog/2024/02/28/programming-projects/
𝗛𝗼𝘄 𝘁𝗼 𝗰𝗼𝗱𝗲 𝘄𝗶𝘁𝗵 𝗚𝗶𝘁𝗛𝘂𝗯 𝗖𝗼𝗽𝗶𝗹𝗼𝘁?
A recent study by GitHub and Microsoft discovered that AI now authors 46% of new code. They also found that overall developer productivity surged by 55%, leading to more efficient coding processes. When we talk about AI-powered coding, we mainly talk about GitHub Copilot.
But 𝗵𝗼𝘄 𝗱𝗼𝗲𝘀 𝗚𝗶𝘁𝗛𝘂𝗯 𝗖𝗼𝗽𝗶𝗹𝗼𝘁 𝘄𝗼𝗿𝗸?
The process consists of the following steps:
𝟭. 𝗦𝗲𝗰𝘂𝗿𝗲 𝗽𝗿𝗼𝗺𝗽𝘁 𝘁𝗿𝗮𝗻𝘀𝗺𝗶𝘀𝘀𝗶𝗼𝗻: Your prompts are securely sent to Copilot, ensuring data privacy.
𝟮. 𝗖𝗼𝗻𝘁𝗲𝘅𝘁𝘂𝗮𝗹 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱𝗶𝗻𝗴: Copilot analyzes the code around your cursor, the file type, and other open files to offer relevant suggestions.
𝟯. 𝗖𝗼𝗻𝘁𝗲𝗻𝘁 𝗳𝗶𝗹𝘁𝗲𝗿𝗶𝗻𝗴: It filters out personal data and inappropriate content, focusing solely on generating helpful code.
𝟰. 𝗖𝗼𝗱𝗲 𝗴𝗲𝗻𝗲𝗿𝗮𝘁𝗶𝗼𝗻: Based on the intent identified in your prompts, Copilot crafts code suggestions that align with your coding style and project standards.
𝟱. 𝗨𝘀𝗲𝗿 𝗶𝗻𝘁𝗲𝗿𝗮𝗰𝘁𝗶𝗼𝗻: Here, we can decide whether to use, tweak, or reject Copilot's suggestions.
𝟲. 𝗙𝗲𝗲𝗱𝗯𝗮𝗰𝗸 𝗹𝗼𝗼𝗽: Copilot learns from your interactions, improving its suggestions. Every time you tweak or reject its ideas, it learns from that. It employs techniques like zero-shot (asking without examples), one-shot (asking with one example), and few-shot prompting (providing multiple examples) to adapt to your instructions, whether you provide examples or not (see the sketch after this list).
𝟳. 𝗣𝗿𝗼𝗺𝗽𝘁 𝗵𝗶𝘀𝘁𝗼𝗿𝘆 𝗿𝗲𝘁𝗲𝗻𝘁𝗶𝗼𝗻: It remembers past prompts and interactions, making future suggestions more accurate.
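To make the zero/one/few-shot idea concrete, here is a hedged sketch of what a "few-shot" prompt can look like inside a source file: the comments supply the examples, and a tool like Copilot would be expected to propose a function body. The slugify function and its body are only one plausible completion, not Copilot's actual output.

```python
# Few-shot style prompt for an AI pair programmer: the examples in the
# comments guide the suggestion. The body below is one plausible completion.
#
# Examples:
#   slugify("Hello World")   -> "hello-world"
#   slugify("  Déjà   vu  ") -> "deja-vu"
import re
import unicodedata

def slugify(text: str) -> str:
    # Drop accents, lowercase, and collapse non-alphanumeric runs to dashes.
    text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
    text = re.sub(r"[^a-zA-Z0-9]+", "-", text.lower())
    return text.strip("-")

print(slugify("Hello World"))    # hello-world
print(slugify("  Déjà   vu  "))  # deja-vu
```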
So, how did they solve it? 𝗧𝗵𝗲𝘆 𝗳𝗶𝗿𝘀𝘁 𝘁𝗿𝗶𝗲𝗱 𝘁𝗼 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱 𝗵𝗼𝘄 𝘁𝗵𝗲 𝘀𝘆𝘀𝘁𝗲𝗺 𝗽𝗲𝗿𝗳𝗼𝗿𝗺𝘀. They tracked what Elixir processes were doing and whether they were stuck waiting on something. They recorded the event types, how many of each kind of message they received, and their processing times. In addition, they looked at how much memory the processes use, the performance of garbage collection, and so on.
After the analysis, they 𝗰𝗿𝗲𝗮𝘁𝗲𝗱 𝘁𝗵𝗲 𝗳𝗼𝗹𝗹𝗼𝘄𝗶𝗻𝗴 𝘀𝘁𝗿𝗮𝘁𝗲𝗴𝘆:
𝟭. 𝗣𝗮𝘀𝘀𝗶𝘃𝗲 𝘀𝗲𝘀𝘀𝗶𝗼𝗻𝘀: Discord significantly reduced the amount of data processed and sent by differentiating between active and passive user connections, cutting the fanout work by 90% for large servers.
𝟮. 𝗥𝗲𝗹𝗮𝘆𝘀: Implementing a relay system (essentially splitting the fanout work across multiple processes) allowed Discord to spread the fanout across multiple machines, enabling a single guild to use more resources and support larger communities. Relays maintain connections to the sessions instead of the guild and are responsible for doing the fanout with permission checks.
𝟯. 𝗪𝗼𝗿𝗸𝗲𝗿 𝗽𝗿𝗼𝗰𝗲𝘀𝘀𝗲𝘀 𝗮𝗻𝗱 𝗘𝗧𝗦: To keep servers responsive, Discord employed worker processes and Erlang Term Storage (ETS) for operations that require iterating over large sets of members, thus avoiding bottlenecks in the guild process. ETS is an in-memory store that multiple Elixir processes can access safely. This makes it possible to spawn a new worker process, pass it the ETS table, and let that process run the expensive operation, offloading the central guild server (see the sketch below).
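As an illustration only, here is a minimal Python sketch of the "offload expensive iteration to a worker" idea: a thread and a plain dict stand in for the Elixir worker process and the ETS table, and all names are hypothetical, not Discord's code.

```python
# Conceptual sketch: keep the "guild" loop responsive by handing the
# expensive fanout iteration to a worker (a thread here, a process at Discord).
import queue
import threading
import time

# Stand-in for an ETS table: member metadata shared with workers.
members = {f"user{i}": {"active": i % 10 == 0} for i in range(100_000)}
guild_mailbox = queue.Queue()

def fanout_worker(snapshot, event):
    # The heavy O(members) iteration runs off the main guild loop, and only
    # "active" sessions receive the update (passive sessions are skipped).
    delivered = sum(1 for meta in snapshot.values() if meta["active"])
    print(f"delivered {event!r} to {delivered} active sessions")

def guild_loop():
    while True:
        event = guild_mailbox.get()
        if event is None:
            break
        threading.Thread(target=fanout_worker, args=(members, event)).start()

threading.Thread(target=guild_loop, daemon=True).start()
guild_mailbox.put("message: hello")
guild_mailbox.put(None)
time.sleep(0.5)  # give the worker time to print before the script exits
```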
🔗https://discord.com/blog/maxjourney-pushing-discords-limits-with-a-million-plus-online-users-in-a-single-server
𝟮𝟬 𝗦𝗤𝗟 𝗾𝘂𝗲𝗿𝘆 𝗼𝗽𝘁𝗶𝗺𝗶𝘇𝗮𝘁𝗶𝗼𝗻 𝘁𝗲𝗰𝗵𝗻𝗶𝗾𝘂𝗲𝘀
Below are the 20 SQL query optimization techniques that I found most significant (a short example follows the list):
1. Create an index on huge tables (> 1,000,000 rows)
2. Use EXISTS instead of COUNT() to check whether a row exists
3. SELECT specific fields instead of using SELECT *
4. Avoid subqueries in the WHERE clause
5. Avoid SELECT DISTINCT where possible
6. Use the WHERE clause instead of HAVING
7. Create joins with INNER JOIN (not WHERE)
8. Use LIMIT to sample query results
9. Use UNION ALL instead of UNION wherever possible
10. Use UNION instead of a WHERE ... OR ... query
11. Run your query during off-peak hours
12. Avoid using OR in join queries
13. Choose GROUP BY over window functions
14. Use derived and temporary tables
15. Drop the index before loading bulk data
16. Use materialized views instead of views
17. Avoid the != or <> (not equal) operator
18. Minimize the number of subqueries
19. Use INNER JOIN as little as possible when you can get the same output using a LEFT/RIGHT JOIN
20. Reuse temporary tables when you need the same dataset repeatedly
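Here is a small, hedged illustration of a few of these tips (1, 2, 3, and 8) using SQLite from Python; the table and data are made up, and the same ideas carry over to other engines, where the gains on large tables are usually far bigger.

```python
# Demonstrates: an index on the filter column, EXISTS instead of COUNT,
# selecting only the needed columns, and LIMIT for sampling.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    CREATE INDEX idx_orders_customer ON orders(customer_id);  -- tip 1
    INSERT INTO orders (customer_id, total) VALUES (1, 10.0), (1, 25.5), (2, 7.0);
""")

# Tip 2: EXISTS can stop at the first matching row instead of counting them all.
has_orders = conn.execute(
    "SELECT EXISTS (SELECT 1 FROM orders WHERE customer_id = ?)", (1,)
).fetchone()[0]

# Tips 3 and 8: name the columns you need and sample with LIMIT.
sample = conn.execute(
    "SELECT id, total FROM orders WHERE customer_id = ? LIMIT 10", (1,)
).fetchall()

print(has_orders, sample)  # 1 [(1, 10.0), (2, 25.5)]
```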
𝗦𝘁𝗮𝗰𝗸 𝗢𝘃𝗲𝗿𝗳𝗹𝗼𝘄 𝗔𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲 𝗜𝘀 𝗡𝗼𝘁 𝗪𝗵𝗮𝘁 𝗬𝗼𝘂 𝗧𝗵𝗶𝗻𝗸 𝗜𝘁 𝗜𝘀
In a recent interview with Scott Hanselman, 𝗥𝗼𝗯𝗲𝗿𝘁𝗮 𝗔𝗿𝗰𝗼𝘃𝗲𝗿𝗱𝗲, 𝗛𝗲𝗮𝗱 𝗢𝗳 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴 𝗮𝘁 𝗦𝘁𝗮𝗰𝗸 𝗢𝘃𝗲𝗿𝗳𝗹𝗼𝘄, told the story of Stack Overflow's architecture. They handle more than 6,000 requests per second and 2 billion page views per month, and they manage to render a page in about 12 milliseconds. If we think about it a bit, we could imagine they use some kind of 𝗺𝗶𝗰𝗿𝗼𝘀𝗲𝗿𝘃𝗶𝗰𝗲 𝘀𝗼𝗹𝘂𝘁𝗶𝗼𝗻 𝘁𝗵𝗮𝘁 𝗿𝘂𝗻𝘀 𝗶𝗻 𝘁𝗵𝗲 𝗖𝗹𝗼𝘂𝗱 𝘄𝗶𝘁𝗵 𝗞𝘂𝗯𝗲𝗿𝗻𝗲𝘁𝗲𝘀.
But the story is a bit different. Their solution is 15 years old, and it is a 𝗯𝗶𝗴 𝗺𝗼𝗻𝗼𝗹𝗶𝘁𝗵𝗶𝗰 𝗮𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻 𝗿𝘂𝗻𝗻𝗶𝗻𝗴 𝗼𝗻-𝗽𝗿𝗲𝗺𝗶𝘀𝗲𝘀. It is actually 𝗮 𝘀𝗶𝗻𝗴𝗹𝗲 𝗮𝗽𝗽 on IIS, which runs 200 sites. This single app is running on nine web servers and a single SQL Server (with the addition of one hot standby).
They also use 𝘁𝘄𝗼 𝗹𝗲𝘃𝗲𝗹𝘀 𝗼𝗳 𝗰𝗮𝗰𝗵𝗲: one on SQL Server with a large amount of RAM (1.5 TB), where roughly 30% of DB accesses are served from memory, and two Redis servers (master and replica). Besides this, they have 3 tag engine servers and 3 Elasticsearch servers, which handle 34 million daily searches.
All this is handled by a 𝘁𝗲𝗮𝗺 𝗼𝗳 𝟱𝟬 𝗲𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝘀, who manage to 𝗱𝗲𝗽𝗹𝗼𝘆 𝘁𝗼 𝗽𝗿𝗼𝗱𝘂𝗰𝘁𝗶𝗼𝗻 𝗶𝗻 𝟰 𝗺𝗶𝗻𝘀 several times daily.
Their 𝗳𝘂𝗹𝗹 𝘁𝗲𝗰𝗵 𝘀𝘁𝗮𝗰𝗸 is:
🔹 C# + ASP.NET MVC
🔹 Dapper ORM
🔹 StackExchange.Redis
🔹 MiniProfiler
🔹 Jil JSON deserializer
🔹 Exceptional, an error logger for SQL
🔹 Sigil, a .NET CIL generation helper (for when C# isn’t fast enough)
🔹 NetGain, a high-performance web socket server
🔹 Opserver, monitoring dashboard polling most systems and feeding from Orion, Bosun, or WMI.
🔹 Bosun, backend monitoring system, written in Go
𝗛𝗼𝘄 𝘁𝗼 𝗱𝗼 𝗰𝗼𝗱𝗲 𝗿𝗲𝘃𝗶𝗲𝘄𝘀 𝗽𝗿𝗼𝗽𝗲𝗿𝗹𝘆
An essential step in the software development lifecycle is code review. It enables developers to significantly improve code quality. It resembles authoring a book: the author writes the story, and an editor reviews it to catch mistakes like mixing up "you're" with "your." Code review, in this context, means examining and assessing other people's code.
There are different 𝗯𝗲𝗻𝗲𝗳𝗶𝘁𝘀 𝗼𝗳 𝗮 𝗰𝗼𝗱𝗲 𝗿𝗲𝘃𝗶𝗲𝘄: it ensures consistency in design and implementation, optimizes code for better performance, provides an opportunity for learning, knowledge sharing, and mentoring, and promotes team cohesion.
What should you look for in a code review? Try to look for things such as:
🔹 𝗗𝗲𝘀𝗶𝗴𝗻 (does this integrate well with the rest of the system, and do the interactions between components make sense)
🔹 𝗙𝘂𝗻𝗰𝘁𝗶𝗼𝗻𝗮𝗹𝗶𝘁𝘆 (does this change do what the developer intended)
🔹 𝗖𝗼𝗺𝗽𝗹𝗲𝘅𝗶𝘁𝘆 (is this code more complex than it needs to be)
🔹 𝗡𝗮𝗺𝗶𝗻𝗴 (is naming good?)
🔹 𝗘𝗻𝗴. 𝗽𝗿𝗶𝗻𝗰𝗶𝗽𝗹𝗲𝘀 (SOLID, KISS, DRY)
🔹 𝗧𝗲𝘀𝘁𝘀 (are different kinds of tests used appropriately, code coverage),
🔹 𝗦𝘁𝘆𝗹𝗲 (does it follow style guidelines),
🔹 𝗗𝗼𝗰𝘂𝗺𝗲𝗻𝘁𝗮𝘁𝗶𝗼𝗻, etc.
𝗗𝗶𝗱 𝗜 𝗴𝗶𝘃𝗲 𝗺𝘆 𝗯𝗲𝘀𝘁 𝗹𝗮𝘀𝘁 𝘄𝗲𝗲𝗸?
No two days and no two weeks are the same.
"Best" can mean something different on different days.
This is why we need weekly and monthly goals.
And it is the results that matter, not the effort.
I wish you a great week ahead 👋
𝗛𝗼𝘄 𝘁𝗼 𝘂𝘀𝗲 𝘂𝗻𝗱𝗼𝗰𝘂𝗺𝗲𝗻𝘁𝗲𝗱 𝗪𝗲𝗯 𝗔𝗣𝗜𝘀?
There are several methods to tackle this issue, primarily involving intercepting traffic originating from a web API. If the goal is to intercept HTTP/HTTPS traffic from various sources, one approach involves manually constructing a custom sniffer. However, this method can be burdensome as it requires tailoring the solution for each API individually.
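For a sense of what "constructing a custom sniffer" can involve, here is a minimal, hedged Python sketch of a logging HTTP forward proxy. It handles plain-HTTP GET requests only; intercepting HTTPS would additionally require certificate handling, which is exactly why a ready-made tool is usually less burdensome.

```python
# Minimal logging HTTP forward proxy (plain HTTP, GET only) to illustrate
# how traffic from an undocumented API could be captured by hand.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class SniffingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # When a client uses this server as an HTTP proxy, self.path is the
        # absolute URL of the request, which we log before forwarding it.
        print(f"[captured] GET {self.path}")
        with urlopen(self.path) as upstream:
            body = upstream.read()
        self.send_response(upstream.status)
        self.send_header("Content-Length", str(len(body)))
        self.send_header("Content-Type",
                         upstream.headers.get("Content-Type", "application/octet-stream"))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Point the client's HTTP proxy setting at 127.0.0.1:8080 and watch the log.
    HTTPServer(("127.0.0.1", 8080), SniffingProxy).serve_forever()
```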
Now, Postman offers a solution to sniff traffic from any API that uses the HTTP/HTTPS protocol. What is good about this feature is that traffic capture lets you generate a Postman collection, which you can then use to test, evaluate, and document the captured APIs.
Check more at the following link:
🔗 https://blog.postman.com/introducing-postman-new-improved-system-proxy/
Implementing RSA in Python from Scratch
🔗 https://coderoasis.com/implementing-rsa-from-scratch-in-python/
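For a taste of the idea before reading the article, here is a toy sketch of textbook RSA with tiny fixed primes; real implementations need large random primes and proper padding, and this is not the article's code.

```python
# Toy textbook RSA: key generation, encryption, and decryption with tiny primes.
from math import gcd

p, q = 61, 53                # small demo primes; never use sizes like this in practice
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, must be coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent via modular inverse (Python 3.8+)

message = 42                 # a message must be an integer smaller than n
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
print(ciphertext, recovered)  # recovered == 42
assert recovered == message
```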
𝗚𝗶𝘁 𝗠𝗲𝗿𝗴𝗲 𝘃𝘀 𝗥𝗲𝗯𝗮𝘀𝗲
One of the most powerful Git features is branching. Yet, while working with branches, we need to integrate changes from one branch into another, and there is more than one way to do that.
We have two ways to do it:
𝟭. 𝗠𝗲𝗿𝗴𝗲
When you merge Branch A into Branch B (with 𝚐𝚒𝚝 𝚖𝚎𝚛𝚐𝚎), Git creates a new merge commit. This commit has two parents, one from each branch, symbolizing the confluence of histories. It's a non-destructive operation, preserving the exact history of your project, warts and all. Merges are particularly useful in collaborative environments where maintaining the integrity and chronological order of changes is essential. Yet, merge commits can clutter the history, making it harder to follow specific lines of development.
𝟮. 𝗥𝗲𝗯𝗮𝘀𝗲
When you rebase Branch A onto Branch B (with 𝚐𝚒𝚝 𝚛𝚎𝚋𝚊𝚜𝚎), you're essentially saying, "Let's pretend these changes from Branch A were made on top of the latest changes in Branch B." Rebase rewrites the project history by creating new commits for each commit in the original branch. This results in a much cleaner, straight-line history. Yet, it could be problematic if multiple people work on the same branch, as rebasing rewrites history, which can be challenging if others have pulled or pushed the original branch.
So, when to use them:
🔹 𝗨𝘀𝗲 𝗺𝗲𝗿𝗴𝗶𝗻𝗴 𝘁𝗼 𝗽𝗿𝗲𝘀𝗲𝗿𝘃𝗲 𝘁𝗵𝗲 𝗰𝗼𝗺𝗽𝗹𝗲𝘁𝗲 𝗵𝗶𝘀𝘁𝗼𝗿𝘆, especially on shared branches or for collaborative work. It's ideal for feature branches to merge into a main or develop branch.
🔹 𝗨𝘀𝗲 𝗿𝗲𝗯𝗮𝘀𝗶𝗻𝗴 𝗳𝗼𝗿 𝗽𝗲𝗿𝘀𝗼𝗻𝗮𝗹 𝗯𝗿𝗮𝗻𝗰𝗵𝗲𝘀 or when you want a clean, linear history for easier tracking of changes. Remember to rebase locally and avoid pushing rebased branches to shared repositories. Also, be aware 𝗻𝗼𝘁 𝘁𝗼 𝗿𝗲𝗯𝗮𝘀𝗲 𝗽𝘂𝗯𝗹𝗶𝗰 𝗵𝗶𝘀𝘁𝗼𝗿𝘆. If your branch is shared with others, rebasing can rewrite history in a way that is disruptive and confusing to your collaborators.
𝗗𝗼 𝘆𝗼𝘂 𝘀𝘂𝗳𝗳𝗲𝗿 𝗳𝗿𝗼𝗺 𝗜𝗺𝗽𝗼𝘀𝘁𝗲𝗿 𝗦𝘆𝗻𝗱𝗿𝗼𝗺𝗲?
Always believing that you must know everything before you start?
Adjust your viewpoint.
Be a smart learner!
Use KeePassXC to sign your git commits
🔗 https://code.mendhak.com/keepassxc-sign-git-commit-with-ssh/
𝗛𝗼𝘄 𝗱𝗼𝗲𝘀 𝗗𝗶𝘀𝗰𝗼𝗿𝗱 𝗵𝗮𝗻𝗱𝗹𝗲 𝗮 𝗺𝗶𝗹𝗹𝗶𝗼𝗻 𝗼𝗻𝗹𝗶𝗻𝗲 𝘂𝘀𝗲𝗿𝘀 𝗶𝗻 𝗮 𝘀𝗶𝗻𝗴𝗹𝗲 𝘀𝗲𝗿𝘃𝗲𝗿?
Over time, Discord's user base, including its most prominent communities, has grown massively. This affected servers, which started to slow down and hit their throughput limits. So they needed to scale individual Discord servers from tens of thousands to millions of concurrent users.
Whenever someone sends a message on Discord or joins a channel, the UI of everyone online on that server needs to be updated. Discord calls that server a "𝗴𝘂𝗶𝗹𝗱," which runs in a 𝘀𝗶𝗻𝗴𝗹𝗲 𝗘𝗹𝗶𝘅𝗶𝗿 𝗽𝗿𝗼𝗰𝗲𝘀𝘀, while there is another process (a "𝘀𝗲𝘀𝘀𝗶𝗼𝗻") for each connected client. The guild process tracks the sessions of users who are members of that guild and is responsible for fanning out actions to those sessions. When sessions receive updates, they forward them over the WebSocket connection to the client.
The main issue is that 𝗮 𝘀𝗶𝗻𝗴𝗹𝗲 𝗺𝗲𝘀𝘀𝗮𝗴𝗲 𝗻𝗲𝗲𝗱𝘀 𝘁𝗼 𝗴𝗼 𝘁𝗼 𝘁𝗵𝗲 𝗻𝘂𝗺𝗯𝗲𝗿 𝗼𝗳 𝗽𝗲𝗼𝗽𝗹𝗲 𝗼𝗻𝗹𝗶𝗻𝗲 on that server, which means that if a server has 1,000 people online and each of them sends one message, that's 1,000 × 1,000 = 1 million notifications.
𝗛𝗼𝘄 𝗧𝗼 𝗘𝗻𝗮𝗯𝗹𝗲 𝗖𝗼𝗻𝘁𝗶𝗻𝘂𝗼𝘂𝘀 𝗜𝗻𝘁𝗲𝗴𝗿𝗮𝘁𝗶𝗼𝗻 𝘄𝗶𝘁𝗵 𝗣𝘂𝗹𝗹 𝗥𝗲𝗾𝘂𝗲𝘀𝘁𝘀?
With Pull Requests, we often lose a proper Continuous Integration (CI) process, because code reviews delay integration. Here comes the “Ship/Show/Ask” branching strategy. The point is that not all pull requests need a code review.
So, whenever we make a change, we have three options:
🔹 𝗦𝗵𝗶𝗽 - Small changes that don’t need a review can be pushed directly to the main branch. We have build pipelines running on the main branch, which run tests and other checks, so they act as a safety net for our changes. Some examples are: fixing a typo, bumping a minor dependency version, updating documentation.
🔹 𝗦𝗵𝗼𝘄 - Here, we want to show what has been done. You work on a branch, open a Pull Request, and merge it without a review. You still want people to be notified of the change (to review it later), but you don’t expect essential discussions. Some examples are: a local refactoring, a bug fix, an added test case.
🔹 𝗔𝘀𝗸 - Here, we make our changes and open a Pull Request while waiting for feedback. We do this because we want a proper review, or because we need feedback on our approach. This is the classical way of making Pull Requests. Some examples are: adding a new feature, a major refactoring, a proof of concept.
Encryption and Decryption using Linear Algebra with C++
This project implements a text encryption and decryption system using a matrix-based technique. It serves as an educational and practical exploration of matrix-based encryption, demonstrating the fundamental concepts of encryption and decryption in a user-friendly manner.
💻https://github.com/farukalpay/TextEncryptionWithLinearAlgebra
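As a minimal illustration of the matrix idea (in Python rather than C++, and not taken from the repo), here is a classic Hill-cipher-style example with a 2×2 key over the alphabet mod 26:

```python
# Hill-cipher-style matrix encryption: multiply letter pairs by a key matrix
# mod 26; decryption uses the key's modular inverse.
KEY = [[3, 3], [2, 5]]         # det = 9, coprime with 26, so the key is invertible
KEY_INV = [[15, 17], [20, 9]]  # inverse of KEY modulo 26

def apply_matrix(matrix, pair):
    return [(matrix[row][0] * pair[0] + matrix[row][1] * pair[1]) % 26 for row in range(2)]

def transform(text, matrix):
    nums = [ord(ch) - ord("A") for ch in text]
    out = []
    for i in range(0, len(nums), 2):   # text length must be even in this sketch
        out.extend(apply_matrix(matrix, nums[i:i + 2]))
    return "".join(chr(n + ord("A")) for n in out)

ciphertext = transform("HELP", KEY)
plaintext = transform(ciphertext, KEY_INV)
print(ciphertext, plaintext)  # HIAT HELP
```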
Do you enjoy reading this channel?
Perhaps you have thought about placing ads on it?
To do this, follow three simple steps:
1) Sign up: https://telega.io/c/computer_science_and_programming
2) Top up the balance in a convenient way
3) Create an advertising post
If the topic of your post fits our channel, we will publish it with pleasure.
𝗧𝗵𝗲 𝗕𝗲𝘀𝘁 𝗦𝗼𝗳𝘁𝘄𝗮𝗿𝗲 𝗔𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲 𝗕𝗼𝗼𝗸𝘀 𝗜𝗻 𝗘𝘃𝗲𝗿𝘆 𝗖𝗮𝘁𝗲𝗴𝗼𝗿𝘆
Check out this list of books tagged with software architecture. They are ranked by Goodreads score with a few simple rules applied (the book must be relevant to software architecture, its content must not be obsolete, it must be tech agnostic, and its average rating must be above 3.5). The score takes into account the number of written reviews, the average rating, the number of ratings, and the publishing date.
💻 https://github.com/mhadidg/software-architecture-books
𝗟𝗲𝗮𝗿𝗻 𝗳𝘂𝗻𝗱𝗮𝗺𝗲𝗻𝘁𝗮𝗹𝘀, 𝗻𝗼𝘁 𝗳𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸𝘀
Have you ever wondered why some technologies are still with us while others have disappeared? Here is 𝘁𝗵𝗲 𝗟𝗶𝗻𝗱𝘆 𝗘𝗳𝗳𝗲𝗰𝘁 to explain it. This effect tells me that 𝗯𝘆 𝘁𝗵𝗲 𝘁𝗶𝗺𝗲 𝗜 𝗿𝗲𝘁𝗶𝗿𝗲, 𝗱𝗲𝘃𝗲𝗹𝗼𝗽𝗲𝗿𝘀 𝘄𝗶𝗹𝗹 𝘀𝘁𝗶𝗹𝗹 𝗯𝗲 𝘂𝘀𝗶𝗻𝗴 𝗖# 𝗮𝗻𝗱 𝗦𝗤𝗟. It is a concept in technology and innovation suggesting that the future life expectancy of a non-perishable item is proportional to its current age. In other words, the longer something has been in use, the longer it is likely to continue to be used.
The concept was named after Lindy's Deli in New York City, where Nassim Nicholas Taleb popularized it in his book "𝗧𝗵𝗲 𝗕𝗹𝗮𝗰𝗸 𝗦𝘄𝗮𝗻." According to Taleb, the Lindy effect applies to many things, including technologies, ideas, and cultures, and evaluates their potential longevity.
In software development, we see that 𝗳𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸𝘀 𝗰𝗼𝗺𝗲 𝗮𝗻𝗱 𝗴𝗼, 𝗯𝘂𝘁 𝗹𝗮𝗻𝗴𝘂𝗮𝗴𝗲𝘀 𝘀𝘂𝗰𝗵 𝗮𝘀 𝗦𝗤𝗟 𝗼𝗿 𝗖# 𝗮𝗻𝗱 𝗰𝗼𝗻𝗰𝗲𝗽𝘁𝘀 𝘀𝘂𝗰𝗵 𝗮𝘀 𝗢𝗯𝗷𝗲𝗰𝘁-𝗼𝗿𝗶𝗲𝗻𝘁𝗲𝗱 𝗽𝗿𝗼𝗴𝗿𝗮𝗺𝗺𝗶𝗻𝗴 𝗼𝗿 𝗦𝗢𝗟𝗜𝗗 𝗽𝗿𝗶𝗻𝗰𝗶𝗽𝗹𝗲𝘀 𝘀𝘁𝗮𝘆. All the energy I put into learning those technologies 10-15 years ago continues to support my work today. Some things changed, but the fundamentals stayed and even got better.
So, try to 𝗹𝗲𝗮𝗿𝗻 𝘁𝗵𝗶𝗻𝗴𝘀 𝘁𝗵𝗮𝘁 𝗱𝗼𝗻'𝘁 𝗰𝗵𝗮𝗻𝗴𝗲 (quote from Jeff Bezos). Focus on foundations, not frameworks. I've been doing this for two decades now.
UNTANGLE Spring Security Architecture 🔒
Authentication and Authorization:
- Validates user identity and orchestrates controlled resource access.
- Empowers comprehensive user authentication and nuanced authorization.
Security Filters:
- Intercepts incoming requests, meticulously enforcing security measures.
- Offers a flexible, layered security filter chain for diverse protection strategies.
Custom Authentication Providers:
- Authentication Provider: Extends authentication capabilities beyond the default configuration. Facilitates tailored authentication strategies and seamless integration.
- DaoAuthenticationProvider: Adopts a database-backed approach for user authentication. Checks user credentials against stored records, heightening security.
Authentication Manager:
- Orchestrates the authentication process, coordinating various authentication providers.
- Serves as a pivotal component in managing user identity verification.
Token-based Security (JWT):
- Implements advanced token-based authentication for stateless communication.
- Facilitates secure interaction without the need for server-side session storage (a small sketch follows at the end of this overview).
Session Management:
- Efficiently manages user sessions, mitigating session-related risks.
- Provides adaptability for session creation, tracking, and invalidation.
Authentication Tokens:
- UsernamePasswordAuthenticationToken: Represents user credentials for authentication purposes.
- Leverages usernames and passwords for robust user verification.
Add/Remove Authentication Token:
- Dynamically enables the addition and removal of authentication tokens.
- Ensures real-time control over user authentication, promoting flexibility.
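To make the token-based part concrete, here is a standard-library-only Python sketch of issuing and verifying an HS256-style token. A real Spring Security setup would do this in Java with a JWT library wired into the filter chain, so treat the names and shape here as illustrative only.

```python
# Stateless, JWT-style tokens: the server signs the claims and later verifies
# the signature instead of keeping session state.
import base64
import hashlib
import hmac
import json

SECRET = b"change-me"  # hypothetical signing key

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(claims: dict) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signature = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

def verify_token(token: str) -> bool:
    header, payload, signature = token.split(".")
    expected = b64url(hmac.new(SECRET, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return hmac.compare_digest(signature, expected)  # no server-side lookup needed

token = issue_token({"sub": "alice", "role": "USER"})
print(token)
print(verify_token(token))        # True
print(verify_token(token + "x"))  # False (tampered signature)
```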