Tesseract, that "quirky command-line tool that does an outstanding job" (credit: A. Kay), truly works well. Give it a shot whenever you get the opportunity.
Sample commands for reference:
- tesseract IMG-1.jpg IMG-1 --psm 4
- tesseract -l eng IMG-2.jpg b1
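For a folder full of scans, the same commands can be looped. A minimal sketch (the `.jpg` extension and file locations are assumptions) that writes a `<name>.txt` next to each image:

```shell
# Batch OCR: run tesseract over every .jpg in the current folder.
# --psm 4 assumes a single column of text of variable sizes.
for img in *.jpg; do
  [ -e "$img" ] || continue            # skip cleanly if no .jpg files exist
  tesseract "$img" "${img%.jpg}" -l eng --psm 4
done
```

Tesseract appends `.txt` to the output base name itself, so `IMG-1.jpg` produces `IMG-1.txt`.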
The Marble Equity Distribution model can be used by very early-stage founders to fairly split equity among themselves. The key aspect of the model is to split the equity across the founders based on their contributions to different Areas such as Sales, Marketing, Technology, etc. (see col. 'Area'), where each Area is assigned a relative weight (see col. 'AreaContrib').
For the calculation shown below, the number of founders is set as N=3, with the weight of the Marble W=2. In a hypothetical set-up where all founders contribute equally to every Area, the equity split per head is 32 (see col. 'AreaContribPerFounder') and the Marble split per head is 16 (see col. 'MarblePerAreaContribFounder').
However, in reality the first founder (Fndr_1) is more of an Ops, Customer Support and HR generalist with some prior domain knowledge and the original idea. Fndr_2 is the technologist with some strategy & management experience, expected to come up with smart digital solutions. Fndr_3 carries wide Sales, Marketing and Ops experience along with Investor Relations, and also brought in the lion's share of the initial seed capital used to boot the startup.
After the re-balancing of the marbles, the final equity distribution stands at 23, 33 & 44 for the three founders. A further 5-7% of re-balancing could take place depending upon the discussions and negotiations between the founders. While there is no single correct way of distributing equity, having a framework like the Marble Equity Distribution helps keep the process objective and free from let-downs, clashes and unfairness.
S.No | Area | AreaContrib | AreaContribPerFounder (= AreaContrib / NoFounders) | MarblePerAreaContribFounder (= AreaContribPerFounder / WeightMarble) | Fndr_1 | Fndr_2 | Fndr_3
1 | Sales | 22 | 7 | 3.5 | 0 | -3.5 | 3.5
2 | Marketing | 10 | 3 | 1.5 | -1.5 | -1 | 2.5
3 | Technology | 20 | 7 | 3.5 | -3.5 | 7 | -3.5
4 | Operations, Customer Support | 10 | 3 | 1.5 | 0.5 | -1 | 0.5
5 | HR | 10 | 3 | 1.5 | -1 | 2 | -1
6 | Legal | 3 | 1 | 0.5 | -0.5 | -0.5 | 1
7 | Finance, Payroll | 5 | 2 | 1 | -1 | -1 | 2
8 | Investor Relations, Networking | 10 | 3 | 1.5 | -1.5 | -1 | 2.5
9 | Initial Investment, Seed Capital | 10 | 3 | 1.5 | -1.5 | -1 | 2.5
10 | Total | 100 | 32 | 16 | -10 | 0 | 10
11 | Initial Distribution | | | | 33 | 33 | 34
12 | Final Distribution | | | | 23 | 33 | 44

NoFounders (N) = 3
WeightMarble (W) = 2
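The re-balancing arithmetic can be sketched in a few lines of awk (the file name and layout are assumptions, purely for illustration). Each line of `marbles.txt` holds one Area's marble deltas for the three founders; since every row sums to zero, marbles only move between founders, and the final split is simply the initial distribution plus each founder's column total.

```shell
# One row per Area: delta_Fndr_1 delta_Fndr_2 delta_Fndr_3
cat > marbles.txt <<'EOF'
0    -3.5  3.5
-1.5 -1    2.5
-3.5  7   -3.5
0.5  -1    0.5
-1    2   -1
-0.5 -0.5  1
-1   -1    2
-1.5 -1    2.5
-1.5 -1    2.5
EOF

awk -v i1=33 -v i2=33 -v i3=34 '
  { s1 += $1; s2 += $2; s3 += $3 }   # sum marble deltas per founder
  END { printf "%d %d %d\n", i1 + s1, i2 + s2, i3 + s3 }
' marbles.txt
# prints: 23 33 44
```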
At a time when major cities across India like Bangalore and Delhi are experiencing a major water crisis, critical interventions are the need of the hour. For what could work, one should look at South Africa and its handling of the Day Zero crisis: the day when no more potable water is available for use by the citizens. Here's some related coverage: "Day Zero: Where Next?" (https://www.nationalgeographic.com/science/article/partner-content-south-africa-danger-of-running-out-of-water) & "Bengaluru is dying of thirst because it’s drinking its own Kool-Aid" (https://the-ken.com/the-nutgraf/bengaluru-is-dying-of-thirst-because-its-drinking-its-own-kool-aid/).
Reverse Osmosis (RO) water purifiers are both a boon & a bane for the average household. With supply water TDS remaining far above palatable levels, ordinary non-RO basic filtration machines are rendered useless. But then RO machines end up throwing away about 5-10 litres of waste water (depending upon various factors) for every litre of drinking water purified. A criminal waste of a precious resource!
Now, we've been recycling the RO waste water for other household purposes such as cleaning and watering plants. This process might take you back in time to the days of filling up buckets of water from the supply line, the well, etc. (a reality for many to this day). Though a bit cumbersome, this recycling works. In a span of one day we are able to collect about 2-3 buckets (30-40 litres) of water which would otherwise have gone down the drain. All said and done, it's well worth the effort!
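A quick back-of-the-envelope check of what that recycling adds up to over a year (assuming ~35 litres a day, the midpoint of the 30-40 litre range):

```shell
# Rough annual savings from recycling RO reject water
litres_per_day=35
echo "$(( litres_per_day * 365 )) litres saved per year"
# prints: 12775 litres saved per year
```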
There's been a huge churn within the telecom sector in India over the last couple of years. This was precipitated by the laws and taxation policies in force, along with rampant anti-competitive predatory practices, rock-bottom pricing, mergers & acquisitions, and essential upgrades to technology and infrastructure, alongside high growth in mobile phone & internet users (100+ crores) in the country.
The last factor helped bridge the connectivity and digital divide between the rural and the urban parts of the nation. Users of all age groups from all over came on board and got hooked on to social media sites and chatting apps. Online meetings, business communications, schooling, banking and payments, etc. went the mobile apps route.
While things were good for a while, a flip side to the story soon emerged. The large-scale adoption was brought on by unsustainable predatory pricing by the players, esp. the new entrants, which was closely followed by price cuts from the rest. This race to the bottom, as expected, led to the sinking of all but the most financially solvent players. Some exited, others merged, and the rest continue to struggle to stay afloat. An upward revision of prices therefore seems like the only way out of this mess.
On the other hand, a rise in prices will likely result in a drop in the number of users, particularly from the marginalized and weaker sections of the population. Perhaps a study is in order (or has already been done) showing the impact per thousand (or lakh) users of every rupee (or ten) increase in prices. The mobile inclusivity gained by citizens at long last must not be lost at any cost. Incorrect policies, corporate practices, profit motives, etc. of the past should not result in the nation regressing on the digital inclusivity front.
A sure-shot Catch-22 for the policy makers in the sector:
- To save the telecom players (via upward price revisions), or
- To preserve/ promote digital inclusivity for the citizens (particularly for the vulnerable).
One option worth considering is to relook at the telecom pricing model. Telecom players these days offer various "Unlimited Plans" that bundle unlimited data and call time (with daily sub-limits of a few GB, minutes, etc.). These are among the most popular plans and have led to an explosion in daily usage. People no longer keep track of usage while calling or using the internet/ data packs. As a result, mobile bandwidth stays practically choked all through the day. Poor-quality service, including frequent call drops, false rings and slow data connections, is a menace for everyone. There is also the adverse environmental impact of the constant energy wastage at the level of devices, networks, switches, mobile towers, and so on.
The telecom pricing model of the past was the much more sensible "Pay-As-You-Go". Just like other shared, limited-supply utilities such as water, electricity, etc., telecom bandwidth (service) should also revert to standard pay-as-you-go. This prevents wastage and allows a much fairer distribution of constrained resources.
Under pay-as-you-go there is a flat/ fixed nominal monthly subscription charge, plus a variable usage cost billed per unit. Additionally, by having separate consumption slabs, heavy/ corporate users can be made to pay more (as per a higher-cost slab), while normal/ light users pay less, making it easy on the pocket of the ordinary user and yet profitable for the telecom players.
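The slab idea above can be sketched in a few lines of shell (all rates and slab boundaries below are hypothetical, purely to illustrate the structure: a fixed charge plus per-GB rates that rise with consumption):

```shell
# Slab-based pay-as-you-go bill for a month's data usage (in GB).
bill() {
  awk -v gb="$1" 'BEGIN {
    fixed = 50                                    # flat monthly charge (Rs.)
    cost = fixed
    s1 = (gb < 10 ? gb : 10);                     cost += s1 * 5   # first 10 GB @ Rs. 5/GB
    s2 = (gb < 10 ? 0 : (gb < 50 ? gb - 10 : 40)); cost += s2 * 10  # next 40 GB @ Rs. 10/GB
    s3 = (gb > 50 ? gb - 50 : 0);                 cost += s3 * 20  # above 50 GB @ Rs. 20/GB
    printf "%d\n", cost
  }'
}
bill 5    # light user:  50 + 25             = 75
bill 60   # heavy user:  50 + 50 + 400 + 200 = 700
```

The heavy user pays a steeper average rate, which is exactly the cross-subsidy the paragraph above describes.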
An allied benefit of pay-as-you-go pricing is that the number of mindless forwards, fakes and misinformation/ disinformation will go down, if not disappear entirely. Most people would be averse to spending even a few rupees daily on a barrage of forwards and fakes, resulting in a socially better and environmentally healthier world to live in!
Finally, to ensure inclusivity for the needy segments of society, so that nobody gets left out, a separate "Janta (Citizen) Mobile Plan" could be introduced & a direct-to-account bill subsidy constituted. These small changes, along with more significant ones on corporate policy, laws, taxation and fair trade practices, will ensure that India regains its lost ground in the telecom sector.
The instructions for installing Canon LBP2900 Printer on Ubuntu 20.04 remain pretty much the same as given in an earlier post about Installing Canon LBP2900 Printer on Ubuntu 16.04. The few differences seen this time around are the following:
1) Newer version of LINUX-CAPT-DRV-v271-UKEN: Download link as mentioned on the Canon Support page.
The tarball includes 64-bit installers: cndrvcups-common_3.21-1_amd64.deb & cndrvcups-capt_2.71-1_amd64.deb (rpm files & 32-bit versions are also included).
2) Dependency on libglade2-0:
$ sudo apt-get install libglade2-0
3) Workaround for CAPT 64-bit OS issues linking to 32-bit libraries:
As mentioned in the earlier post, CAPT running on a 64-bit OS has certain dependencies on 32-bit libraries. Even though the two deb files (mentioned above) install without these dependencies, peculiar behaviour/ error messages appear when the dependencies are missing. On the final step, viewing the status of the printer on captstatusui, an error message is shown: "Check the DevicePath of /etc/ccpd.conf".
The solution is simply to check for the missing dependencies:
(Check for error messages such as not executable, not found, etc.)
$ ldd /usr/bin/captfilter
$ ldd /usr/bin/capt* | sort | uniq | grep "not found"
In case of error install the missing dependencies:
$ sudo apt-get install libc6:i386 libpopt0:i386
$ sudo apt-get install zlib1g:i386 libxml2:i386 libstdc++6:i386
4) Follow the remaining instructions of the earlier post:
=> Install CAPT printer driver (downloaded .deb files mentioned above)
=> Add printer to the system (command-line lpadmin, system-config-printer, or UI System Settings > Printers)
=> Add printer to ccpdadmin
=> View status of printer on captstatusui (should show "Ready to Print" if installed correctly)
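For reference, a condensed command-line sketch of those steps might look like the following. The printer name, ppd file name, ccp port and device path below are typical CAPT defaults and are assumptions here; cross-check them against the earlier post and your own set-up:

```shell
# Install the downloaded driver debs (common package first)
sudo dpkg -i cndrvcups-common_3.21-1_amd64.deb
sudo dpkg -i cndrvcups-capt_2.71-1_amd64.deb

# Register the printer with CUPS using the CAPT ppd (name is an assumption)
sudo lpadmin -p LBP2900 -m CNCUPSLBP2900CAPTK.ppd -v ccp://localhost:59687 -E

# Register the device with the CAPT daemon and restart it
sudo /usr/sbin/ccpdadmin -p LBP2900 -o /dev/usb/lp0
sudo service ccpd restart

# View status; should show "Ready to Print" if all went well
captstatusui -P LBP2900
```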
Anatomy of an AI System is a real eye-opener. It gives a high-level view of the enormous complexity and scale of the supply chains, manufacturers, assemblers, miners, transporters and other links that collaborate at a global scale to commercialize something like an Amazon Echo device.
The authors explain how the extreme exploitation of human labour, environment and resources at various levels remains largely unacknowledged and unaccounted for. Right from the mining of rare elements, to smelting and refining, to shipping and transportation, to component manufacture and assembly, these processes mostly happen under inhuman conditions, with complete disregard for the health, well-being and safety of workers who are paid miserable wages. These processes also cause irreversible damage to the ecology and the environment at large.
Though the Amazon Echo, as an AI-powered, self-learning device connected to cloud-based web services, opens up several privacy, safety, intrusion and digital-exploitation concerns for the end-user, focusing solely on the Echo would amount to missing the forest for the trees! Most issues highlighted here are equally true of technologies from many other traditional and non-AI (or not-yet-AI) sectors like automobiles, electronics, telecom, etc. Time to give these issues some thought and put a stop to the irreversible damage to human lives, well-being, finances, equality, and to the environment and planetary resources!
In the article titled "Field Notes #1 - Easy Does It", author Will Kurt highlights a key aspect of doing good Data Science: simplicity. This means, first and foremost, getting a good understanding of the problem to be solved, and then, among the hypotheses & possible solutions/ models, favouring the simpler ones, or at least giving them a fair/ equal chance at proving their worth in tests employing standardized performance metrics.
Another article of relevance for Data Scientists is from the allied domain of statistics, titled "The 10 most common mistakes with statistics, and how to avoid them". The article, based on the eLife paper by Makin and Orban de Xivry, lists the ten most common statistical mistakes in scientific research. The paper also includes tips for reviewers to detect such mistakes and for researchers (authors) to avoid them.
Many of the issues listed are linked to the p-value computation, which is used to establish the significance of statistical tests & draw conclusions from them. Its incorrect usage, understanding, correction, manipulation, etc. renders tests ineffective and results in insignificant findings getting reported. Issues with sampling and inadequate control groups, along with faulty attempts by authors to establish causation where none exists, are also common in the scientific literature.
As per the authors, these issues typically arise from ineffective experimental designs, inappropriate analyses and/or flawed reasoning. A strong publication bias & pressure on researchers to publish significant results, as opposed to correct but failed experiments, make matters worse. Moreover, senior researchers entrusted with mentoring juniors are often unfamiliar with the fundamentals and prone to making these errors themselves. Their aversion to criticism becomes a further roadblock to improvement.
While correct mentoring of early-stage researchers will certainly help, change can also come from making science open access. Open science/ research must include details on all aspects of the study and all the materials involved, such as data and analysis code. At the level of institutions and funders, incentivizing correctness over productivity can also prove beneficial.
A feature missing from the Bank/ FinTech value chain is Domestic Debit Card (DC) to Domestic Bank A/c (BA) transfer. With the wide proliferation of debit cards, payment gateways and POS vendors all provide C2B payments through these channels. But debit card to Bank A/c transfer, which would instead be a C2C/ P2P provision (via regulated payment intermediaries), doesn't exist.
There must be valid reasons for this, such as the regulator disallowing it, security & fraud considerations, transaction fees & gateway charges, error handling & reversal mechanisms, and so on.
Alternatives do exist, such as bank transfer provisions like NEFT, RTGS, etc. and UPI mobile-app based P2P transactions. All the issues mentioned above hold true for these modes as well, yet those solutions were allowed to run and mature over time. So why not DC to BA?
If ever such a feature were to be rolled out by Banks/ FinTechs, all that customers would need is a single web page on the service provider/ Bank website to capture the necessary card and beneficiary account details. The back-end gateway systems would then handle validation and the actual transfer. That's about it!
As explained in the past, various safety features, such as family shield filters from providers like OpenDNS, Cloudflare and others, DNS over HTTPS (DoH), and HTTP Strict Transport Security (HSTS), can be used for hassle-free safe browsing across devices for members of the family. To additionally secure and regulate usage by young kids, Parental Control features and tools can be employed on the devices and networks being accessed by children.
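As one concrete example of such a filter, on a Linux machine running systemd-resolved, pointing DNS at the OpenDNS FamilyShield resolvers (208.67.222.123 and 208.67.220.123, which pre-filter adult content) could look like the following sketch (the drop-in file name is an arbitrary choice):

```shell
# Route all DNS lookups through OpenDNS FamilyShield via a
# systemd-resolved drop-in config
sudo mkdir -p /etc/systemd/resolved.conf.d
cat <<'EOF' | sudo tee /etc/systemd/resolved.conf.d/familyshield.conf
[Resolve]
DNS=208.67.222.123 208.67.220.123
EOF
sudo systemctl restart systemd-resolved
```

On a home router, the same two resolver addresses can instead be set once in the DHCP/DNS settings to cover every device on the network.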
Parental Controls are available out of the box across most device operating systems (OS) such as Android, iOS, and so on. All that the parent needs to do is log in to the device using their credentials, indicate to the device (OS) that its user is a child, and switch ON parental controls. Once that's done, the parental controls only allow specific apps to run (apps whitelisted by the parent) while disallowing all others, and also filter out potentially harmful content from various sites and resources online.
Conceptually, that's pretty much all there is to Parental Controls! For more info, check out online resources such as those by Vodafone, Vi and Google for a better understanding and for setting up parental controls to protect your kids online.
FFmpeg is a fantastic converter for editing & creating video & audio files. As is typical of *nix command-line tools, ffmpeg has several options that need to be correctly configured to use the tool properly.
Architecturally, ffmpeg works on streams of video, audio, images or other data, which are passed through reader/ writer (demuxer/ muxer) and decoder/ encoder layers to edit and create media files:
Image Credit: Official Ffmpeg Linux Manual
The command prompt may be a little overwhelming to start off with, but a little playing with the tool reveals its immense potential. The official documentation page & Linux manual have a few examples to get you started.
Beyond this, there are several online resources, blogs and articles that list the various ffmpeg commands and options. For those averse to the shell prompt, there are also several GUI tools built on top of ffmpeg that can be explored.
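To get a feel for the tool, here are a few common recipes (all file names below are placeholders):

```shell
# 1) Change the container format without re-encoding (fast, lossless)
ffmpeg -i input.mkv -c copy output.mp4

# 2) Clip 30 seconds starting at the 1-minute mark
ffmpeg -ss 00:01:00 -i input.mp4 -t 30 -c copy clip.mp4

# 3) Extract the audio track as MP3
ffmpeg -i input.mp4 -vn -codec:a libmp3lame -q:a 2 audio.mp3

# 4) Batch-convert every .mkv in a folder, deriving output names
for f in *.mkv; do
  [ -e "$f" ] || continue           # skip cleanly if no .mkv files exist
  ffmpeg -i "$f" -c copy "${f%.mkv}.mp4"
done
```

Note the recurring `-c copy`: it copies the compressed streams as-is instead of re-encoding, so container changes and cuts complete in seconds.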