Data Science Strategy: Managing the Rapid Evolution of AI

From OnnoWiki

Managing the Rapid AI Technology Evolution and Lack of Standardization

AI/ML technologies are constantly evolving and becoming more and more advanced. As computational efficiency increases, such technologies can now adjust to run on a smaller hardware footprint. These advances also push analytics, ML, and AI realization to the edge, meaning that an algorithm has the computational support to run inside a device itself, rather than the device merely providing data to an algorithm running remotely in the cloud, for example. That's a good trend, because it will allow society to utilize machine intelligence to a broader extent across system environments and billions of mobile devices and other connected entities.

However, one area is not keeping up with all these rapid changes: ML/AI standardization. The lack of standardization isn't, of course, something that you or a single company can solve, but it's important to be aware of this situation as part of your data science strategy. And of course, at the end of the day, all data scientists share a responsibility to strive toward more standardization in machine learning and artificial intelligence. But just because no official, international standard exists yet, it doesn't mean that no initiatives exist. The standards that are out there are mostly what is often referred to as de facto standardization, derived from influential open source initiatives coordinated through universities like UC Berkeley (the AMPLab and RISELab) and companies like Google (Google Beam) and AT&T (Acumos).

Another detectable trend involves increased concern about access to (and usage of) personal information for nontransparent or even hidden reasons. This has resulted in stricter legislation in different countries, but has also led to more ongoing discussion about the need to increase regulation and impose standards related to AI ethics.
This positive trend will hopefully continue to push human society to better envision, as a group, what the future of AI utilization should look like.

Of course, this trend has a downside. Because so little standardization is currently available to lean on for your data science investment, you need to account for the possibility that you will have to make serious adjustments to your infrastructure when new standards finally appear in the near future. The worst-case scenario? You may have to repeat this process several times, or even totally remodel your entire infrastructure.

My advice to you? Continuously follow trends, and be on the lookout for any indication that new laws, regulations, or standardization initiatives are coming down the pike, especially open source ones. And be prepared to adjust your data science approach to match the changes that are coming.