Visual Studio lays out roadmap for early 2019 releases

Microsoft is planning for the future of Visual Studio. In a newly released roadmap, the company detailed some of the things that will be coming to the IDE in Q1 of 2019.

As requested by the community, Visual Studio will be made per-monitor dots per inch (DPI) aware, which will improve clarity when working across monitors with different DPIs. Some services will also be moved to the background to improve load times.

Xamarin developers will get features such as Xamarin.Forms 4.0 templates and tooling support, Xamarin.Android Designer improvements and support for constraint layouts, Xamarin.Forms Previewer improvements, Enhanced Fast Deployment for Xamarin.Android, and Android API 29 support.

Microsoft will update tooling for WinForms and WPF development with .NET Core 3, in addition to updating Test Explorer to provide better performance for large numbers of tests.

Python developers will get full-featured debugging, an interactive window, and an IntelliSense experience in Open Folder. They will also be able to take advantage of auto-reload when debugging Python Flask and Django apps.

Other features include the ability to add Azure SQL databases, Storage Accounts, Application Insights, and Key Vault to Azure App Service instances; support for running .NET unit tests on projects that target more than one .NET framework; extensibility support for third-party test frameworks to integrate with Real Time Test Discovery; and x:Bind support for XAML Edit and Continue.

The roadmap is available here.

The post Visual Studio lays out roadmap for early 2019 releases appeared first on SD Times.

Angular 7.0.0 released

The latest major release of the mobile and desktop framework Angular is now available. Angular 7.0.0 updates the entire platform, including the core framework, Angular Material, and the CLI, and brings new toolchain features along with several partner launches.

“Early adopters of v7 have reported that this update is faster than ever, and many apps take less than 10 minutes to update,” Stephen Fluin, developer advocate for Angular, wrote in a post.

One of the key updates in the release is the availability of CLI prompts. The CLI can now prompt users as they run common commands, helping them discover built-in features, according to Fluin.

The release also comes with a focus on performance. After reviewing common mistakes throughout the ecosystem, the team found developers were shipping the reflect-metadata polyfill in production, when it is only needed in development. As a result, the updated version of Angular will remove it from the polyfills.ts file and include it as a build step only when needed. Other performance updates include Bundle Budgets in the CLI, which can warn developers when an initial bundle is more than 2MB.
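Bundle budgets are declared per build target in a project's angular.json; a minimal fragment matching that threshold might look like the following (the surrounding file structure is omitted, and the 5MB error ceiling shown alongside the 2MB warning is the CLI's documented default):

```json
"budgets": [
  {
    "type": "initial",
    "maximumWarning": "2mb",
    "maximumError": "5mb"
  }
]
```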

Minor visual updates have been added to Angular Material to reflect updates made to the Material Design specification. The CDK now includes virtual scrolling and drag and drop capabilities.

Additional features of the release include improved accessibility of selects, and support for content projection in Angular Elements.

The team also announced a number of partner launches, including Angular Console, @angular/fire, NativeScript, and StackBlitz 2.0.

The post Angular 7.0.0 released appeared first on SD Times.

SD Times news digest: Datalore 1.0, MIT’s smarter homes, and Ubuntu 18.10

JetBrains has announced the release of Datalore 1.0, an intelligent web application for data analysis and visualization in Python. Datalore provides data scientists with an intelligent Python code editor, intentions, incremental recalculations, collaborative features, and a version control system.

The 1.0 release introduces three major updates: the choice between on-the-go and user-controlled code execution, the ability to arrange the editor vertically or horizontally, and an upgraded professional plan.

MIT researchers are creating smarter homes
MIT researchers have developed Duet, a system designed to make smart homes more fully automated by using reflected wireless signals to localize individual occupants. The system also uses algorithms that ping nearby mobile devices to identify people, based on who last used the device and what their predicted movement trajectory is.

“Smart homes are still based on explicit input from apps or telling Alexa to do something. Ideally, we want homes to be more reactive to what we do, to adapt to us,” said Deepak Vasisht, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and lead author on a paper describing the system.

Ubuntu 18.10 is now available
Ubuntu 18.10 has been released with several updates that optimize it for multi-cloud deployments and AI software development. It features a new community desktop theme and adds fingerprint unlock functionality for compatible PCs.

It also has richer snap desktop integration, and snaps with native desktop controls can now access files on the host system.

Waymo outlines its safety protocols
Autonomous driving company Waymo has released its first Safety Report, providing an overview of the company’s process for safely testing and deploying autonomous vehicle technology.

According to the report, each vehicle that is deployed is tested under a combination of simulation testing, closed-course testing, and real-world driving.

When an emergency or law enforcement vehicle is nearby, sensors can identify a fire truck, detect flashing lights, and hear sirens from hundreds of feet away. The sensors are designed to be able to tell what direction sirens are coming from. When an emergency vehicle is detected, the self-driving vehicle will respond by yielding, pulling over, or coming to a complete stop, the report explained.

The post SD Times news digest: Datalore 1.0, MIT’s smarter homes, and Ubuntu 18.10 appeared first on SD Times.

SD Times Open-Source Project of the Week: Infer.NET

Microsoft has announced its machine learning framework Infer.NET is now open source. Infer.NET focuses specifically on running Bayesian inference in graphical models. According to the company, it can also be used for probabilistic programming.

“Open sourcing Infer.NET represents the culmination of a long and ambitious journey. Our team at Microsoft Research in Cambridge, UK embarked on developing the framework back in 2004. We’ve learned a lot along the way about making machine learning solutions that are scalable and interpretable. Infer.NET initially was envisioned as a research tool and we released it for academic use in 2008. As a result, there have been hundreds of papers published using the framework across a variety of fields, everything from information retrieval to healthcare,” Yordan Zaykov, principal research software engineering lead, wrote in a post.

Microsoft believes the framework can be used to solve machine learning problems like classification, recommendations, and clustering, as well as to build customized solutions for domain-specific problems. It has been applied to a variety of different domains such as information retrieval, bioinformatics, epidemiology, and vision.

“Infer.NET enables a model-based approach to machine learning. This lets you incorporate domain knowledge into your model. The framework can then build a bespoke machine learning algorithm directly from that model. This means that instead of having to map your problem onto a pre-existing learning algorithm that you’ve been given, Infer.NET actually constructs a learning algorithm for you, based on the model you’ve provided,” Zaykov wrote.

Zaykov explained a huge advantage of the framework is interpretability. “If you have designed the model yourself and the learning algorithm follows that model, then you can understand why the system behaves in a particular way or makes certain predictions. As machine learning applications gradually enter our lives, understanding and explaining their behavior becomes increasingly more important,” Zaykov wrote.

The best way to use Infer.NET, according to Zaykov, is when you have extensive knowledge about the specific domain of the problem you are trying to solve, or when you need to interpret the behavior of a system that is important to you.

Going forward, Infer.NET will become a part of ML.NET, Microsoft’s machine learning framework for .NET developers. A repository under the .NET Foundation is already being set up to provide integration with ML.NET.

The post SD Times Open-Source Project of the Week: Infer.NET appeared first on SD Times.

What the Cloudera and Hortonworks merger means

Earlier this month, data companies Cloudera and Hortonworks announced that they would be entering into a merger. Both companies were leaders in the big data space.

The companies stated that they hoped the merger would enable them to become a next-generation data platform together.

Cloudera was founded in 2008 with talent from Google, Facebook, and Yahoo in an effort to deliver a big data platform built on Hadoop. Three years later, Hortonworks was created with the same goal, explained Cloudera chief strategy officer Mike Olson in a post.

“Since 2011, our two companies have each innovated to build better products and win more business. In the competitive world of data management, we can each look with respect at the success of the other. I’m proud of what we’ve done at Cloudera, and I’m impressed at what the team has accomplished at Hortonworks,” Olson wrote.

In the years since the two companies were founded, the competitive landscape has changed significantly, and now includes a large number of companies.

According to Tendü Yoğurtçu, CTO of Syncsort, a close partner of both companies, the two companies have emerged as clear winners in the data space and gained momentum. “But each has had its own unique strengths, and been up against the challenges of an emerging data management platform. Hortonworks clearly put focus on IoT and streaming use cases whereas Cloudera put focus on data science use cases, strengthening the platform with machine learning and artificial intelligence. The merger is great news for consolidation of the market, bringing together the strengths of each company into one single entity that will be able to advance its solutions and the industry’s maturation far faster than either could alone, and to address the cloud and hybrid cloud market requirements.”

According to a post written by Rob Bearden, president and CEO of Hortonworks, the merger with Cloudera is the first step in the next chapter of the company’s evolution. “Cloudera has a like-minded approach to next generation data management and analytics solutions for hybrid deployments. Like Hortonworks, Cloudera believes data can drive high velocity business model transformations, and has innovated in ways that benefit the market and create new revenue opportunities. We are confident that our combined company will be ideally positioned to redefine the future of data as we extend our leadership and expand our offerings,” Bearden wrote.

Not all companies believe that the merger indicates a positive change for the two data giants. Many believe it is an attempt by the two companies to stay alive in the changing data landscape.

Ashish Thusoo, CEO and co-founder of cloud-native data platform Qubole, thought that the merger was inevitable. “We are not surprised by this merger. We see it as something of a swan song for the two declining legacy players in the market, and proof that the on-prem data world is becoming obsolete. The market is evolving away from the Hadoop vendors – who haven’t been able to fulfill their promise to customers – toward cloud native options that are much more nimble and better suited to solve the big data challenges of businesses,” he said.

“I can’t find any innovation benefits to customers in this merger,” said John Schroeder, CEO and chairman of the board at MapR. “It is entirely about cost cutting and rationalization. This means their customers will suffer. MapR has been innovating and delivering a better data platform for years, and we continue to see Cloudera and Hortonworks customers move to MapR.”

What does the merger mean for customers?
According to Olson, the merger is not yet official and the deal will take several months to close. Bearden expects the transaction to close in the first quarter of 2019. Until then, the two companies will continue to operate separately.

Cloudera’s CDH product and Hortonworks’ HDP and HDF products will each be supported and maintained for at least three years after the merger closes.

Each company’s partners will have access to a larger partner community and will benefit from a single standard to build on, as well as a larger company with more customers, Olson explained.

Both Cloudera and Hortonworks released major versions of their platforms over the summer, which Olson called a “fortunate coincidence.” Both releases were based on current open-source project releases, which means the two development lines are closer than they have been in a while. The companies expect to be able to quickly combine the two into a single “unity” release soon after the deal is closed, Olson explained.

They will work to ensure that the single platform will include all of the features that the companies had separately developed, allowing customers to seamlessly upgrade from their current installations to the new product. Together, the companies will serve over 800 customers, Bearden wrote.

The companies will also work to quickly cross-port their unique product offerings — Hortonworks DataFlow, Cloudera Data Science Workbench (CDSW), and Cloudera’s Workload Experience Manager (Workload XM) —  so that they work with both of the platforms. This will ensure that customers will have access to “the innovative products of both,” explained Olson.

“By merging Cloudera’s investments in data warehousing and machine learning with Hortonworks’ investments in end-to-end data management, we are generating a winning combination, which will establish the standard for hybrid cloud data management,” said Bearden.

The post What the Cloudera and Hortonworks merger means appeared first on SD Times.

PostgreSQL 11 now available

The PostgreSQL team has announced a major update to its open-source relational database. PostgreSQL 11 has been released with performance improvements and specific enhancements for large databases and high computational workloads. It is the first major release since PostgreSQL 10 arrived last year.

Additionally, the release includes updates to the table partitioning system, support for stored procedures, query parallelism improvements and parallelized data definition capabilities.

“For PostgreSQL 11, our development community focused on adding features that improve PostgreSQL’s ability to manage very large databases,” said Bruce Momjian, a core team member of the PostgreSQL global development group. “On top of PostgreSQL’s proven performance for transactional workloads, PostgreSQL 11 makes it even easier for developers to run big data applications at scale.”

The release adds the ability to partition data by a hash key, known as hash partitioning, and improves data federation through functionality improvements for partitions. In addition, it introduces a catch-all default partition and support for moving rows to the correct partition when the partition key is updated. Other partitioning updates include improved query performance and support for the upsert feature on partitioned tables.
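As a rough sketch of the new DDL (table and column names here are illustrative; the statements would normally be run through any PostgreSQL 11 client):

```python
# Illustrative DDL for two PostgreSQL 11 partitioning features:
# hash partitioning, and the catch-all DEFAULT partition.
hash_partition_ddl = [
    # Parent table whose rows are distributed by a hash of sensor_id.
    "CREATE TABLE readings (sensor_id int, value numeric) PARTITION BY HASH (sensor_id)",
    # Two hash partitions; each row lands where hash(sensor_id) mod 2 matches.
    "CREATE TABLE readings_p0 PARTITION OF readings FOR VALUES WITH (MODULUS 2, REMAINDER 0)",
    "CREATE TABLE readings_p1 PARTITION OF readings FOR VALUES WITH (MODULUS 2, REMAINDER 1)",
]

default_partition_ddl = [
    # Range-partitioned parent table.
    "CREATE TABLE events (day date, payload text) PARTITION BY RANGE (day)",
    "CREATE TABLE events_2018 PARTITION OF events FOR VALUES FROM ('2018-01-01') TO ('2019-01-01')",
    # New in 11: rows matching no explicit partition fall into the DEFAULT one.
    "CREATE TABLE events_default PARTITION OF events DEFAULT",
]
```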

PostgreSQL 11 also adds transaction support in stored procedures. “Developers have been able to create user-defined functions in PostgreSQL for over 20 years, but prior to PostgreSQL 11, these functions were unable to manage their own transactions,” the team wrote in a post. With PostgreSQL 11’s SQL procedures and full transaction management capabilities, developers can now create advanced server-side apps that involve incremental bulk data loading, the team explained.
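A minimal sketch of such a procedure (names are illustrative): the COMMIT inside the loop, which is not possible in a plain function, releases each batch of an incremental bulk load independently.

```python
# PL/pgSQL procedure text demonstrating PostgreSQL 11 transaction control;
# a procedure (unlike a function) may COMMIT mid-execution, so a bulk load
# can be broken into independently committed batches.
BULK_LOAD_PROC = """
CREATE PROCEDURE load_batches() LANGUAGE plpgsql AS $$
BEGIN
    FOR i IN 1..1000 LOOP
        INSERT INTO staging VALUES (i);
        IF i % 100 = 0 THEN
            COMMIT;  -- releases the current batch; invalid inside a function
        END IF;
    END LOOP;
END
$$;
"""

# Procedures are invoked with CALL rather than SELECT:
INVOKE = "CALL load_batches();"
```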

In addition, the release adds support for Just-In-Time compilation. This is designed to accelerate the execution of expressions during query execution, the team explained.

Other features include performance gains in parallel sequential scans, the ability to parallelize SELECT queries run by CREATE TABLE .. AS, and expanded functionality for working with window functions.

The next release for PostgreSQL will be PostgreSQL 11.1, which will contain bug fixes. The next release with new features will be PostgreSQL 12.

The post PostgreSQL 11 now available appeared first on SD Times.

SD Times news digest: Twilio Autopilot, Hazelcast Jet 0.7, and GitHub’s Octoprenticeship

Communications platform Twilio has announced Twilio Autopilot, which is a programmable conversational AI platform that can be used to build custom bots, IVRs, and home assistant apps.

Autopilot was designed to allow developers to write their applications once and deploy them to any supported channel without having to write additional code, according to the company.

“Machine learning is the most transformative technology of our time,” said Nico Acosta, director of product and engineering at Twilio. “However, until now, the tools available for building machine learning-powered conversational experiences have been too complex and not optimized for developers, which has led to poor customer experiences. We built Autopilot to make companies successful in building bots that delight users, instead of frustrating them.”

Hazelcast releases the first enterprise edition of Hazelcast Jet 0.7
Hazelcast has released the first enterprise edition of Hazelcast Jet 0.7. New enterprise features include a Management Center, which provides greater visibility into Hazelcast Jet clusters, and a Security Suite with end-to-end TLS encryption, mutual authentication with X.509 certificates, and role-based authorization over data structures and actions.

Improvements to the open-source edition in the Jet 0.7 release include the ability to dynamically re-scale to adapt to workload changes, friendlier data management, and a new JDBC connector that allows Jet to process data from the majority of databases.

GitHub launches apprenticeship program
GitHub has launched its first-ever apprenticeship program, the Octoprenticeship, which aims to lower the barrier to entry into tech. Apprentices will work on real GitHub projects, partner with a mentor, develop professional skills, and gain additional learning opportunities and resources throughout the program.

“It can be particularly difficult for individuals with non-traditional work and educational backgrounds to find full-time roles. For every computer science graduate we hire, we know there are others who could also make a significant impact at GitHub: people whose work experience is primarily outside of tech and are looking to pivot into the industry; people who have taken time off for caregiving and are coming back into the workforce; and people who don’t have a traditional developer or engineering background (perhaps they are self-taught or participated in a coding program),” GitHub wrote in a post.

Google reveals new security chip, Titan M
Google has revealed a new security chip that it will include in new Pixel 3 and Pixel 3 XL devices. Titan M is a low-power security module that provides several functions.

Titan M is used to store and enforce locks and rollback counters used by Android Verified Boot, store secrets, provide backing for the Android Strongbox Keymaster module, enforce factory reset policies, and ensure that Google can’t unlock a phone or install firmware updates without the user’s authorization.

Titan M’s processor, caches, memory, and persistent storage are not shared with the rest of the phone’s system, blocking side-channel attacks such as Spectre and Meltdown.

“As more of our lives are bound up in our phones, keeping those phones secure and trustworthy is increasingly important. Google takes that responsibility seriously. Titan M is just the latest step in our continuing efforts to improve the privacy and security of all our users,” the Android team wrote in a post.

The post SD Times news digest: Twilio Autopilot, Hazelcast Jet 0.7, and GitHub’s Octoprenticeship appeared first on SD Times.

MongoDB introduces the Server Side Public License for open source

Open-source developers are tired of being taken advantage of by technology giants. Larger companies with practically unlimited resources are swooping into open-source projects, leveraging the work for their own monetary gain, and leaving smaller companies to fend for themselves.

Recently, a group of disgruntled developers and companies took to the Commons Clause as a way to protect their open-source work. However, this caused great controversy within the open-source industry because the clause added restrictions to open-source licenses, therefore violating the accepted definition of open source as well as the guidelines for the Open Source Initiative’s (OSI) approved open-source licenses, according to Vicky Brasseur, vice president of the OSI.

However, another company is trying to address the problem with the creation of a new license. MongoDB has announced the Server Side Public License (SSPL) as a way to continue to make MongoDB publicly available without having to worry about costly litigation or other open-source problems. MongoDB currently offers a free and open-source cross-platform document-oriented database solution. The license will be applied to the company’s community server. Any company that runs a publicly available version of MongoDB as a service, or any software under the SSPL, will have to either obtain a commercial license from MongoDB or open source the software it uses, according to MongoDB.

“This should be a time of incredible opportunity for open source. The revenue generated by a service can be a great source of funding for open source projects, far greater than what has historically been available. The reality, however, is that once an open source project becomes interesting, it is too easy for large cloud vendors to capture most of the value while contributing little or nothing back to the community. As a result, smaller companies are understandably unwilling to wager their existence against the strategic interests of the large cloud vendors, and most new software is being written as closed source,” Eliot Horowitz, CTO and co-founder of MongoDB, wrote in a blog post announcing the license.

According to Horowitz, the SSPL was created out of a need to address issues with the GNU AGPLv3 (AGPL). He explained that this license is currently the best option for open-source companies licensing their software; however, when the software is operated as a service, the management stack used to run it must also be made available under the AGPL.

“This approach was believed to be good enough, as most people understood their obligations to comply with AGPL. However, as AGPL-licensed software like MongoDB has become more popular, organizations like the international cloud providers have begun to test the boundaries of this license. We would prefer to avoid litigation to defend the AGPL but instead devote our time to build great products for the community and our customers,” wrote Horowitz. “The community needs an updated open source license that builds on the spirit of the AGPL, but makes explicit the conditions for providing the licensed software as a service.”

According to MongoDB, the SSPL builds on the AGPL, but is designed to clarify the conditions for providing open-source software as a service. In addition, the license will encompass the same AGPL freedoms such as the freedom to use, review, modify and redistribute the software. “The only substantive change is an explicit condition that any organization attempting to exploit MongoDB as a service must open source the software that it uses to offer such service,” the company explained in its announcement of the license.

The idea of the SSPL differs from the Commons Clause strategy because it is based on the spirit of the AGPL. MongoDB explained that it has submitted the license to the OSI for approval, and expects it will meet the open-source criteria as defined by the OSI.

SD Times reached out to the OSI for comment, but had not heard back as of the time of this writing.

The license officially went into effect on Oct. 16. Any MongoDB Community Server patch releases and versions released on or after that date will be subject to the new license, the company explained. This will include future patch releases of older versions.

“We are big believers in open source. It leads to more valuable, robust and secure software. However, it is important that open source licenses evolve to keep pace with the changes in our industry,” said Dev Ittycheria, president and CEO, MongoDB. “We have invested approximately $300M in R&D over the past decade to offer a modern, general purpose, open source database for everyone. With the added protection of the SSPL, we can continue to invest in R&D and further drive innovation and value for the community.”

The post MongoDB introduces the Server Side Public License for open source appeared first on SD Times.

Electric Cloud’s ElectricFlow 8.5 features “DevOps your way”

There are many tools and ways of doing DevOps. Electric Cloud wants to make sure businesses have the flexibility to use the tools and processes that work best for them with the latest release of ElectricFlow, its enterprise SaaS and on-premises application release orchestration solution.

ElectricFlow 8.5 features a new Kanban-style pipeline view, CI dashboarding capabilities, and object tagging for custom reporting and improved searchability.

“When we say your way, we literally mean it. Maybe you are happy with your process. Maybe you are not. Maybe you want to evolve. We allow you to change your process as part of the process or tool, but we won’t force you to,” said Anders Wallgren, CTO of Electric Cloud. “It becomes the platform you evolve and improve, and be flexible in the way you do your software pipeline.”

The new Kanban view shows the entire release cycle, with all the stages on one screen. It can tell users where they are, what has happened, and what is about to happen. For more details, users can switch back to the pipeline view to see what is running where and when, and what each team was tasked with doing.

The Kanban view was added out of the need to show release managers more about the process and less about the automation. “It is a higher level of abstraction for release managers, but still allows them to drill down into details if they need to,” said Wallgren.

The new CI dashboarding capabilities provide data-driven visibility into CI processes and bottlenecks. This will provide a clearer view into builds, according to Wallgren. In addition, it tracks all of a team’s CI processes no matter which tools they are using, with support for CI tools such as Jenkins, Bamboo and Git.

The newly added object tagging functionality provides more granular, real-world reports for release managers and project owners, according to the company. You can now create reports with universal data segmentation across any object types such as applications, pipeline stages and environments.

In addition, ElectricFlow now features support for SaaS teams in on-premises, cloud-native and serverless environments.

The post Electric Cloud’s ElectricFlow 8.5 features “DevOps your way” appeared first on SD Times.

Atlassian rebuilds Jira for modern software development

Atlassian today is releasing a completely overhauled and rebuilt version of its Jira project management software from the points of view of permissions, navigation and user experience.

“The Jira you needed in 2002 and the workflow you built in 2010 and the permissions model we put together back then isn’t the one we need today,” said Sean Regan, head of growth for software teams at Atlassian. “And that’s really driven by this change in the market around modern software development,” including the adoption of cloud computing, microservices and containers, as well as organizations empowering autonomous development teams.

“You don’t get too many chances to formulate a new foundation to your product. This is the best chance we’ve ever had to do open-heart surgery on Jira,” he added. “That moment was when we made the decision to split our product into a cloud version and a server-based version. We made that decision a couple of years ago, and when we made that decision, we moved our cloud version to AWS. And that was the moment we cracked it open and took a really hard look at the guts of the product, and that drove a lot of changes.”

Among the changes, Atlassian opened up new APIs, including one for feature flag integration. Already, companies such as LaunchDarkly, Optimizely and Rollout have integrated with Jira. “Feature flags and Jira moves [companies] from software factories to labs. Developers get autonomy but executives can see the rollouts aren’t crushing users,” said Regan. 

One of the key new features in Jira is a product roadmap, which “mirrors the flexibility and customization Jira always had but in one single view,” said Jake Brereton, head of marketing for software cloud at Atlassian. As Regan said, “When you have 10 different teams, all shipping and releasing and testing on different schedules, nobody knows what’s going on. It’s mayhem. It’s like writing a book with a different author for every paragraph. It all has to come together and work.”

With the new roadmaps, teams can share their task statuses with their internal stakeholders, so they can see who is working on what task.

Work has been done on project boards in Jira as well. Epics are listed and can be drilled into to see the status of work items. Boards have been re-engineered to enable drag-and-drop workflows, filters and progressions that in the past required developers to write Jira Query Language statements to create. Workflows, issue types and fields can be customized without the need to get developers involved. Jira issues, the core unit of work, can also be customized to show developers only what they need.

The notions of units of work and developer autonomy are reflected in how the new Jira offers feature choices. “We took the idea of templates and ripped out the features and functionality to turn them on or off,” Brereton said. “With autonomy, teams don’t want strict Scrum or Kanban processes. Let the teams figure out what’s best for them. Kanban zealots say there are no backlogs, but if you want, in Jira you can work with a Kanban board and backlogs.”

Regan explained that the philosophical underpinnings of the Jira changes are rooted in the tension between developer autonomy and management control. Managers were using Jira to enforce ‘really ridiculous’ process and control, he said. “Atlassian and Jira are siding with developer autonomy,” he said. “We want developers to be able to design the way they want to work, any workflow, build their own board, design their own issue, give the freedom to develop cool stuff, but make sure the project managers and administrators and executives have enough of the reporting and enough of structure to feel comfortable with that freedom. The Jira of the past allowed administrators to be too restrictive. We’re trying to set an example with the way we’ve designed the product in favor of developer autonomy.”

The post Atlassian rebuilds Jira for modern software development appeared first on SD Times.
